CN109444827A - Azimuth interpolation method for radar video echo display - Google Patents

Azimuth interpolation method for radar video echo display

Info

Publication number
CN109444827A
Authority
CN
China
Prior art keywords
radar
point
display pixel
pixel
sampled point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811290044.7A
Other languages
Chinese (zh)
Other versions
CN109444827B (en)
Inventor
李军侠
朱勇
张永泉
张昕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CSSC Systems Engineering Research Institute
Original Assignee
CSSC Systems Engineering Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CSSC Systems Engineering Research Institute filed Critical CSSC Systems Engineering Research Institute
Priority to CN201811290044.7A priority Critical patent/CN109444827B/en
Publication of CN109444827A publication Critical patent/CN109444827A/en
Application granted granted Critical
Publication of CN109444827B publication Critical patent/CN109444827B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/28Details of pulse systems
    • G01S7/285Receivers
    • G01S7/295Means for transforming co-ordinates or for evaluating data, e.g. using computers
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

An azimuth interpolation method for radar video echo display, characterized in that it comprises the following steps. Step 1: establish the correspondence between display pixels and sampling points. Using the coordinate transformation R = √(X² + Y²), θp = atan(Y/X), compute the correspondence between a display pixel (X, Y) and a sampling point (R, θp), and build a data table storing the radius values of the corresponding range rings. Count the number of display pixels S_R on each range ring and quantize it as N = round(log2(S_R)); in subsequent display processing the S_R pixels are interpolated to 2^N points. Compared with existing methods, the present invention adapts better to radar products with different operating modes and waveforms and better meets the requirements of different azimuth sampling rates, in particular for high-resolution long-range radars; the algorithm also adapts well to the operating modes of radars of different system types.

Description

Azimuth interpolation method for radar video echo display
Technical field
The present invention is an azimuth interpolation method for radar video echo display, particularly suited to the visual display of radar video in which the near region suffers from video flicker and easy loss of small targets, and the far region suffers from the dead-point (blank-pixel) phenomenon.
Background art
Rotating-scan radars account for a large proportion of radars of all kinds, and most of them use a plan position indicator (PPI) display; the visual display of radar video is the main form in which this type of radar presents its detection results. The PPI display presents a wide background area and forms navigation and situation information about targets; typical examples include navigation radars and air search or surveillance radars.
The PPI display mode causes two classes of problems in radar video display: the display blind-area phenomenon and the dead-point phenomenon. The blind-area phenomenon appears in the radar near region: because the display resolution there is lower than the radar sampling rate, the video data of different pulses in the near region are written to the same video memory address, causing near-region video flicker and easy loss of small targets. The dead-point phenomenon appears in the radar far region: the radar resolution decreases there and the data density falls as the radius grows, so some display pixels are left uncovered and appear as black dots.
One way to solve the dead-point phenomenon is to raise the radar's transmit/receive pulse repetition frequency and thereby the azimuth sampling rate, ensuring that the transmitted pulses are relatively redundant at far range so that no display dead points occur. This greatly increases radar hardware cost, so most radars adopt a software processing approach instead.
Summary of the invention
The object of the present invention is to provide an azimuth interpolation method for radar video echo display, so that when the radar cannot satisfy the condition of sufficient azimuth sampling, the plan position indicator display mode can still present the scene information well. An azimuth interpolation method for radar video echo display is characterized in that it comprises the following steps.
Step 1: establish the correspondence between display pixels and sampling points
Using the coordinate transformation R = √(X² + Y²), θp = atan(Y/X), compute the correspondence between display pixel (X, Y) and sampling point (R, θp), and build a data table storing the radius values of the corresponding range rings;
Count the number of display pixels S_R on each range ring and quantize it as N = round(log2(S_R)); that is, in subsequent display processing the S_R pixels are interpolated to 2^N points;
Step 2: display the azimuth-interpolated radar echo
Obtain the radar pulse repetition frequency and the antenna rotation rate from the radar system parameters, and compute, according to the following formula, the radar display's number of azimuth samples per revolution:
where PRF is the radar pulse repetition frequency and n is the antenna rotation rate;
Step 3: divide the display pixel regions
The display pixel area corresponding to the radar sampling region is divided into three regions: near, middle and far:
Compare the quantized number of points on a range ring with the number of pulses formed in a single scan: if N < round(log2(S_A)), the display pixels on that range ring are assigned to the near region;
if N = round(log2(S_A)), the display pixels on that range ring are assigned to the middle region;
if N > round(log2(S_A)), the display pixels on that range ring are assigned to the far region, where N is the quantized number of display pixels on the range ring, N = round(log2(S_R)), and round(log2(S_A)) is the quantized number of azimuth samples per scan;
Step 4: process the display pixel regions by class
The near region is down-sampled to remove redundant display data, the middle region is displayed without resampling, and the far region is up-sampled by interpolation to produce the display data;
2. The azimuth interpolation method for radar video echo display according to claim 1, characterized in that the region-specific processing of step 4 is specifically:
A. Near-region redundancy removal
The sampling points at the range corresponding to the display pixels are transformed to the frequency domain (fast Fourier transform, FFT); the frequency-domain signal is low-pass filtered and 2^Nr points are retained in the frequency domain; the inverse video transform (inverse fast Fourier transform, IFFT) is then performed with the matched number of points; the display pixels are chosen from the decimated data by nearest-angle matching of sampling points;
B. Middle-region display processing
No time-frequency transform or inverse transform is performed; the radar's direct sampling points serve as the candidate points for subsequent display; the display pixels are chosen from the directly sampled radar data by nearest-angle matching of sampling points.
C. Far-region interpolation processing
The sampling points at the range corresponding to the display pixels are transformed to the frequency domain (fast Fourier transform, FFT); the frequency-domain signal is first filtered with the high-cut filter selected in Section 3.6, the high-frequency components are zero-padded to form a 2^Nr-point frequency-domain signal, and the inverse video transform (inverse fast Fourier transform, IFFT) is performed with the azimuth-matched number of points; using the curve of FFT sizes over radius and the corresponding table, the display pixels are chosen from the interpolated data by nearest-angle matching of sampling points.
Compared with existing methods, the present invention adapts better to radar products with different operating modes and waveforms and better meets the requirements of different azimuth sampling rates, in particular for high-resolution long-range radars; the algorithm also adapts well to the operating modes of radars of different system types.
The present invention has the following advantages:
1) It adapts better to radar products with different operating modes and waveforms and better meets different azimuth sampling rate requirements, especially for high-resolution long-range radars; the algorithm adapts well to the operating modes of radars of different system types;
2) It relaxes the requirements on the radar's rotating mechanism: the workflow of the algorithm is unaffected by increases or decreases in the number of azimuth samples caused by rotation-speed deviations;
3) The algorithm is relatively stable: the correspondence between sampling points and video pixels is stable, and the video display does not change significantly when the radar rotation speed is disturbed;
4) The algorithm is close to the conventional method and realizes azimuth sampling virtually, so other subsequent processing can be retained unchanged;
5) Radar signals are mostly processed with frequency-domain techniques, and the radar echo video interpolation here uses similar techniques; the radar display thus borrows from signal processing technology, which favors coordinated and unified planning of the front end and the back end.
Brief description of the drawings
Fig. 1: flow chart of the azimuth interpolation processing of the present invention;
Fig. 2: radius values corresponding to the display pixels of the present invention;
Fig. 3: schematic diagram of the interpolation point counts on different range rings of the present invention;
Fig. 4: radar display zoning diagram of the present invention;
Fig. 5-1: schematic diagram of the FFT sizes on different range rings of the present invention;
Fig. 5-2: list of the FFT sizes on different range rings of the present invention;
Fig. 6: flow chart of the nearest-angle sampling point selection processing of the present invention;
Fig. 7: truncated Gaussian function of the present invention (the domain of a Gaussian function is an infinite interval; here the central region is intercepted to form the truncated Gaussian function);
Fig. 8-1: waveform of the low-pass filter of the present invention;
Fig. 8-2: waveform of the high-cut filter of the present invention.
Specific embodiment
An azimuth interpolation method for radar video echo display is characterized in that it comprises the following steps.
Step 1: establish the correspondence between display pixels and sampling points
Using the coordinate transformation R = √(X² + Y²), θp = atan(Y/X), compute the correspondence between display pixel (X, Y) and sampling point (R, θp), and build a data table storing the radius values of the corresponding range rings;
Count the number of display pixels S_R on each range ring and quantize it as N = round(log2(S_R)); that is, in subsequent display processing the S_R pixels are interpolated to 2^N points;
Step 2: display the azimuth-interpolated radar echo
Obtain the radar pulse repetition frequency and the antenna rotation rate from the radar system parameters, and compute, according to the following formula, the radar display's number of azimuth samples per revolution:
where PRF is the radar pulse repetition frequency and n is the antenna rotation rate;
Step 3: divide the display pixel regions
The display pixel area corresponding to the radar sampling region is divided into three regions: near, middle and far:
Compare the quantized number of points on a range ring with the number of pulses formed in a single scan: if N < round(log2(S_A)), the display pixels on that range ring are assigned to the near region;
if N = round(log2(S_A)), the display pixels on that range ring are assigned to the middle region;
if N > round(log2(S_A)), the display pixels on that range ring are assigned to the far region, where N is the quantized number of display pixels on the range ring, N = round(log2(S_R)), and round(log2(S_A)) is the quantized number of azimuth samples per scan;
Step 4: process the display pixel regions by class
The near region is down-sampled to remove redundant display data, the middle region is displayed without resampling, and the far region is up-sampled by interpolation to produce the display data;
2. The azimuth interpolation method for radar video echo display according to claim 1, characterized in that the region-specific processing of step 4 is specifically:
A. Near-region redundancy removal
The sampling points at the range corresponding to the display pixels are transformed to the frequency domain (fast Fourier transform, FFT); the frequency-domain signal is low-pass filtered and 2^Nr points are retained in the frequency domain; the inverse video transform (inverse fast Fourier transform, IFFT) is then performed with the matched number of points; the display pixels are chosen from the decimated data by nearest-angle matching of sampling points;
B. Middle-region display processing
No time-frequency transform or inverse transform is performed; the radar's direct sampling points serve as the candidate points for subsequent display; the display pixels are chosen from the directly sampled radar data by nearest-angle matching of sampling points.
C. Far-region interpolation processing
The sampling points at the range corresponding to the display pixels are transformed to the frequency domain (fast Fourier transform, FFT); the frequency-domain signal is first filtered with the high-cut filter selected in Section 3.6, the high-frequency components are zero-padded to form a 2^Nr-point frequency-domain signal, and the inverse video transform (inverse fast Fourier transform, IFFT) is performed with the azimuth-matched number of points; using the curve of FFT sizes over radius and the corresponding table, the display pixels are chosen from the interpolated data by nearest-angle matching of sampling points.
A software approach to solving the dead-point phenomenon is to read the original data out of video memory within the scanned area and write it back weighted with the data to be written.
The essence of this method is to exploit the random error of the initial phase of the search radar's circular scan to increase the radar's azimuth coverage density and reduce the blind spots of azimuth sampling.
For gap-free coverage the condition 2R × sin(360°/2N) ≤ 1 should hold on each range ring; the actual computed pixel count is about πR² (see the figure), while the radar's actual number of sampling points is about 2πR² (see the figure). The conventional method is constrained by hardware conditions; a software implementation can reduce the hardware constraints. Simplifying the azimuth-dimension sampling points, equivalent resampling of fast-Fourier-transform data, and frequency-domain-controlled approximation of coordinate-point positions are the most characteristic elements of this technique. Performing azimuth interpolation in this way extends the radar's practical working range and removes the hardware constraints on conventional radar video display.
The azimuth interpolation method for radar video echo display presented here provides an adaptable software approach to azimuth interpolation: when the radar cannot satisfy the condition of sufficient azimuth sampling, the plan position indicator display mode can still present the scene information well. The range rings are innovatively divided into three cases, near, middle and far, and the azimuth sampling requirement on each range ring is met by interpolation or decimation: middle-region range rings match the sampling and need no interpolation, near-region range rings have too many azimuth samples and are decimated, and far-region range rings are undersampled and have their sample counts supplemented by interpolation. The flow of the radar video azimuth interpolation technique is divided into four main steps: radar data sampling preprocessing, data segmentation, region-specific processing, and nearest-angle sampling point selection; Fig. 1 is the corresponding flow chart of the azimuth interpolation technique.
3.1 Establish the correspondence between display pixels and sampling points
Using the coordinate transformation R = √(X² + Y²), θp = atan(Y/X), compute the correspondence between display pixel (X, Y) and sampling point (R, θp), and build a data table storing the radius values of the corresponding range rings; only a quarter of the region is retained in physical memory. Fig. 2 shows the radius values corresponding to the different display pixels.
Count the number of display pixels S_R on each range ring and quantize it as N = round(log2(S_R)); that is, in subsequent display processing the S_R pixels are interpolated to 2^N points. Fig. 3 is a schematic diagram of the interpolation point counts on different range rings.
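To make the mapping concrete, the following Python sketch (hypothetical helper names, assuming a square display quadrant r_max pixels on a side, consistent with the quarter-region storage mentioned above) builds the pixel-to-sample correspondence and the quantized per-ring point count N:

```python
import numpy as np

def build_pixel_sample_table(r_max):
    """Map each display pixel (X, Y) in one quadrant to polar coordinates (R, theta_p)
    and count the display pixels per integer range ring (a sketch, not the patented code)."""
    xs, ys = np.meshgrid(np.arange(r_max), np.arange(r_max), indexing="ij")
    radius = np.sqrt(xs**2 + ys**2)                # R = sqrt(X^2 + Y^2)
    theta_p = np.arctan2(ys, xs)                   # theta_p = atan(Y / X), quadrant-safe
    ring = np.round(radius).astype(int)            # index of the range ring a pixel falls on

    # S_R: number of display pixels on each range ring; N = round(log2(S_R))
    s_r = np.bincount(ring.ravel(), minlength=r_max + 1)
    n_quant = np.zeros_like(s_r)
    nonzero = s_r > 0
    n_quant[nonzero] = np.round(np.log2(s_r[nonzero])).astype(int)
    return radius, theta_p, s_r, n_quant           # later steps interpolate to 2**N points
```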
3.2 Display processing flow for azimuth-interpolated radar echoes
First obtain the radar pulse repetition frequency (PRF) and the antenna rotation rate (n) from the radar system parameters; these correspond to the radar pulse transmission rate and the antenna scan rate, respectively. The radar display's number of azimuth samples per revolution is then computed according to the following formula.
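A minimal sketch of this computation, assuming the standard relation that the per-revolution sample count S_A equals PRF (in Hz) times the revolution period 60/n (n in revolutions per minute), which may differ from the exact expression in the referenced formula:

```python
import math

def azimuth_samples_per_revolution(prf_hz, n_rpm):
    """Assumed reading of the per-revolution azimuth sample count: S_A = PRF * (60 / n)."""
    s_a = prf_hz * 60.0 / n_rpm
    # Quantized azimuth sample count used for the near/middle/far comparison in step 3
    return s_a, round(math.log2(s_a))

# Example: a 2 kHz PRF radar rotating at 24 rpm gives 5000 samples per revolution
s_a, log2_sa = azimuth_samples_per_revolution(2000.0, 24.0)
```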
3.3 Division of the display pixel regions
The radar sampling region is divided into three regions, near, middle and far, by the following processing:
Compare the quantized number of points on a range ring with the number of pulses formed in a single scan: if N < round(log2(S_A)), the display pixels on that range ring are assigned to the near region;
if N = round(log2(S_A)), the display pixels on that range ring are assigned to the middle region;
if N > round(log2(S_A)), the display pixels on that range ring are assigned to the far region. Fig. 4 is the radar display zoning diagram; from inside to outside the zones are the near, middle and far regions.
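As an illustration of this comparison, the short helper below (hypothetical, reusing the n_quant array and the quantized value round(log2(S_A)) from the sketches above) labels each range ring:

```python
import numpy as np

def classify_range_rings(n_quant, log2_sa):
    """Assign each range ring to the near, middle or far region by comparing
    N = round(log2(S_R)) against round(log2(S_A)) (illustrative helper)."""
    regions = np.full(n_quant.shape, "middle", dtype=object)
    regions[n_quant < log2_sa] = "near"   # more azimuth samples than pixels: decimate
    regions[n_quant > log2_sa] = "far"    # fewer azimuth samples than pixels: interpolate
    return regions
```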
3.4 Region-specific processing of the display pixel regions
The display area is processed region by region: the near region is down-sampled to remove redundant display data, the middle region is displayed without resampling, and the far region is up-sampled by interpolation to produce the display data.
Near-region redundancy removal
The sampling points at the range corresponding to the display pixels are transformed to the frequency domain (fast Fourier transform, FFT); the frequency-domain signal is processed with the low-pass filter selected in Section 3.6, 2^Nr points are retained in the frequency domain, and the inverse video transform (inverse fast Fourier transform, IFFT) is performed with the matched number of points; the display pixels are then chosen from the decimated data by nearest-angle matching of sampling points as described in Section 3.5.
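A minimal sketch of this frequency-domain decimation for one range ring, assuming the azimuth samples are a real-valued array and that "retaining 2^Nr points" means keeping the lowest-frequency bins (the optional lowpass argument stands in for the truncated-Gaussian low-pass of Section 3.6):

```python
import numpy as np

def near_region_decimate(ring_samples, n_r, lowpass=None):
    """Reduce the azimuth samples on one range ring to 2**n_r points via FFT,
    optional low-pass weighting, spectrum truncation and IFFT (illustrative sketch)."""
    spectrum = np.fft.fft(ring_samples)
    if lowpass is not None:
        spectrum = spectrum * lowpass              # e.g. a truncated-Gaussian low-pass response
    m = 2 ** n_r                                   # assumes m <= len(ring_samples)
    # Keep the m lowest-frequency bins (positive and negative halves) and rescale
    kept = np.concatenate([spectrum[: m // 2], spectrum[-m // 2:]])
    decimated = np.fft.ifft(kept) * (m / len(ring_samples))
    return decimated.real
```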
Middle-region display processing
No time-frequency transform or inverse transform is performed; the radar's direct sampling points serve as the candidate points for subsequent display; the display pixels are chosen from the directly sampled radar data by nearest-angle matching as described in Section 3.5.
Far-region interpolation processing
The sampling points at the range corresponding to the display pixels are transformed to the frequency domain (fast Fourier transform, FFT); the frequency-domain signal is first filtered with the high-cut filter selected in Section 3.6, the high-frequency components are zero-padded to form a 2^Nr-point frequency-domain signal, and the inverse video transform (inverse fast Fourier transform, IFFT) is performed with the azimuth-matched number of points. Fig. 5 shows the curve of FFT sizes over radius and the corresponding table. The display pixels are chosen from the interpolated data by nearest-angle matching as described in Section 3.5.
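A companion sketch for the far region, zero-padding the spectrum up to 2^Nr points under the same assumptions as the near-region sketch (the optional highcut argument stands in for the Section 3.6 high-cut filter):

```python
import numpy as np

def far_region_interpolate(ring_samples, n_r, highcut=None):
    """Raise the azimuth samples on one range ring to 2**n_r points by FFT,
    optional high-cut weighting, zero-padding of the high frequencies and IFFT (sketch)."""
    l = len(ring_samples)
    spectrum = np.fft.fft(ring_samples)
    if highcut is not None:
        spectrum = spectrum * highcut              # e.g. a truncated-Gaussian high-cut response
    m = 2 ** n_r                                   # assumes m >= len(ring_samples)
    padded = np.zeros(m, dtype=complex)
    padded[: l // 2] = spectrum[: l // 2]          # low positive frequencies
    padded[-(l - l // 2):] = spectrum[l // 2:]     # negative frequencies at the top of the array
    interpolated = np.fft.ifft(padded) * (m / l)
    return interpolated.real
```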
3.5 Nearest-angle sampling point selection method
Nearest sampling point selection chooses, by angular relationship, the neighbouring candidate point to display at each pixel. A virtual azimuth swing is introduced, and the sampling point corresponding to each pixel is matched iteratively so that the angular error is minimized; the pixel then displays the gray value of the corresponding sampling point.
Fig. 6 is the flow chart of the nearest-angle sampling point selection processing.
3.5.1 Sorting of sampling points and pixels
Sampling points and pixels are arranged in order of increasing range-ring radius, according to the range resolution;
The radar sampling points (R, θp) are arranged by angle: the points on each range ring are ordered counter-clockwise by increasing θp, where θp ranges over 0°–360° (excluding 360°) and the angular spacing between sampling points is constant and denoted Δθ;
The display pixels are divided along the radius into range rings of unit spacing, and the pixels (X, Y) within a range ring are ordered with X descending and Y ascending;
3.5.2 Pixel matching
Initialization: starting from the initial display pixel (X0, 0), find and match the initial sampling point (R, θp(t0)) whose θp is closest to 0 at time t0; the angle θd is the rotational difference between the two coordinate systems, θd = θp(t0);
Recursively compute, for the next display pixel (X, Y) with polar coordinates (R, θ(t1)), the matched sampling point; the pixel satisfies X = R·cos θ(t1), Y = R·sin θ(t1).
Denote {R·cos θd, R·sin θd, cos Δθ, sin Δθ} as the set of historical trigonometric values; the historical trigonometric values are updated continuously in the inner loop until a new radius is started, and the expressions below are computed from the historical trigonometric values together with X and Y.
Updating the historical trigonometric set gives the new set {R·cos(θd+Δθ), R·sin(θd+Δθ), cos Δθ, sin Δθ}, with the update formulas R·cos(θd+Δθ) = R·cos θd·cos Δθ − R·sin θd·sin Δθ and R·sin(θd+Δθ) = R·sin θd·cos Δθ + R·cos θd·sin Δθ. Then compute (R²·cos(θ(t1)−θd), R²·sin(θ(t1)−θd)) with the following formulas:
R²·cos(θ(t1)−θd) = R·cos(θ(t1))·R·cos(θd) + R·sin(θ(t1))·R·sin(θd),
R²·sin(θ(t1)−θd) = R·sin(θ(t1))·R·cos(θd) − R·cos(θ(t1))·R·sin(θd).
To determine whether the next sampling point in the sequence matches the display pixel, compute (R²·cos(θ(t1)−θd−Δθ), R²·sin(θ(t1)−θd−Δθ)) with the formulas
R²·cos(θ(t1)−θd−Δθ) = R·cos(θ(t1))·R·cos(θd+Δθ) + R·sin(θ(t1))·R·sin(θd+Δθ),
R²·sin(θ(t1)−θd−Δθ) = R·sin(θ(t1))·R·cos(θd+Δθ) − R·cos(θ(t1))·R·sin(θd+Δθ).
The same processing step computes the matching relation between the following radar sampling point (R, θd+nΔθ) and the current display point, updating the historical trigonometric value set to {cos Δθ, sin Δθ, R·cos(θd+nΔθ), R·sin(θd+nΔθ)}.
From the trigonometric value set, compute (R²·cos(θ(t1)−θd−nΔθ), R²·sin(θ(t1)−θd−nΔθ)) with the formulas
R²·cos(θ(t1)−θd−nΔθ) = R·cos(θ(t1))·R·cos(θd+nΔθ) + R·sin(θ(t1))·R·sin(θd+nΔθ),
R²·sin(θ(t1)−θd−nΔθ) = R·sin(θ(t1))·R·cos(θd+nΔθ) − R·cos(θ(t1))·R·sin(θd+nΔθ).
This continues until the current value R²·sin(θ(t1)−θd−nΔθ) satisfies the condition R²·sin(θ(t1)−θd−nΔθ) ≤ 0, at which point the matching condition judgment is entered;
Matching judgment processing
Compute abs(R²·sin(θ(t1)−θd−nΔθ)) and abs(R²·sin(θ(t1)−θd−(n−1)Δθ)); the smaller of the two values is taken as the final result, the corresponding sampling points being (R, θd+nΔθ) and (R, θd+(n−1)Δθ) respectively, and the historical trigonometric values are updated to those of the chosen point.
The loop then continues, obtaining the corresponding sampling point for each coordinate (X, Y); the starting sampling point is set to the sampling point selected in the previous iteration;
If the current sampling point already satisfies the condition R²·sin ≤ 0, that single point is chosen directly; otherwise the matching computation of Section 3.5.2 continues with the next sampling point.
3.5.3 Matching pixels on the next outer ring
The historical trigonometric value set is recomputed from scratch, and the correspondence computation for the next radius then begins.
The nearest-angle selection method optimizes the computation: the process is based on multiplications and additions and reuses the constant sine value sin Δθ and cosine value cos Δθ, which reduces the data storage required.
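A compact sketch of this matching loop for a single range ring, using only the multiply-add recurrences described above (hypothetical helper; the pixel ordering and initialization are simplified relative to the flow of Fig. 6):

```python
import math

def match_ring_pixels(pixels, radius, theta_d, delta_theta):
    """For each display pixel (X, Y) on a range ring of the given radius, find the index k of
    the nearest sampling angle theta_d + k*delta_theta using only multiply-add recurrences
    (illustrative sketch of the Section 3.5 procedure; pixels are assumed sorted by angle)."""
    cos_d, sin_d = math.cos(delta_theta), math.sin(delta_theta)
    # Historical trigonometric values R*cos(theta_d + k*dtheta), R*sin(theta_d + k*dtheta)
    rc, rs = radius * math.cos(theta_d), radius * math.sin(theta_d)
    k = 0
    matches = []
    for x, y in pixels:                    # x = R*cos(theta(t1)), y = R*sin(theta(t1))
        advanced = False
        while True:
            s_cur = y * rc - x * rs        # = R^2 * sin(theta(t1) - theta_d - k*dtheta)
            if s_cur <= 0:
                break                      # sampling angle has reached or passed the pixel
            # rotate the historical values forward by delta_theta (multiply-add only)
            rc, rs = rc * cos_d - rs * sin_d, rs * cos_d + rc * sin_d
            k += 1
            advanced = True
        if not advanced:
            matches.append(k)              # condition met immediately: keep the current point
            continue
        # compare the current and the previous sampling angles and keep the closer one
        rc_prev, rs_prev = rc * cos_d + rs * sin_d, rs * cos_d - rc * sin_d
        s_prev = y * rc_prev - x * rs_prev
        if abs(s_cur) <= abs(s_prev):
            matches.append(k)
        else:
            matches.append(k - 1)
            rc, rs, k = rc_prev, rs_prev, k - 1   # roll history back to the chosen point
    return matches
```

Here x and y play the roles of R·cos θ(t1) and R·sin θ(t1), so apart from the two constants cos Δθ and sin Δθ no trigonometric functions are evaluated inside the loop, consistent with the storage-saving argument above.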
3.6 Filter selection
The filter's operating parameters are set by the user to control the image display quality. Filter selection involves two classes of control parameters: the filter type and the filter bandwidth.
Three filter types are provided: a truncated-Gaussian filter, a triangular filter and a rectangular filter; the truncated-Gaussian and triangular filters offer three bandwidth settings: −3 dB bandwidth, −30 dB bandwidth and −60 dB bandwidth. Fig. 7 and Fig. 8 show the truncated-Gaussian low-pass filter and the truncated-Gaussian high-cut filter.
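As a sketch of how such a frequency-domain weighting might be constructed (assuming "truncated Gaussian" means a Gaussian response cut to zero outside a finite band, and parameterizing the width by the bin at which the response is a chosen number of dB down; the patent's exact parameterization is not given in the text):

```python
import numpy as np

def truncated_gaussian_response(n_bins, cutoff_bin, atten_db=3.0):
    """Frequency-domain weights of a truncated-Gaussian filter over n_bins FFT bins:
    a Gaussian that is atten_db down at cutoff_bin and truncated to zero beyond roughly
    twice that bin (illustrative sketch with an assumed parameterization)."""
    freqs = np.abs(np.fft.fftfreq(n_bins, d=1.0 / n_bins))    # distance of each bin from DC
    sigma = cutoff_bin / np.sqrt(2.0 * np.log(10.0 ** (atten_db / 20.0)))
    gauss = np.exp(-(freqs ** 2) / (2.0 * sigma ** 2))
    return np.where(freqs <= 2 * cutoff_bin, gauss, 0.0)      # truncation of the Gaussian tail

# Example: a -3 dB truncated-Gaussian weighting for a 256-point spectrum, usable as the
# lowpass or highcut argument of the region-processing sketches above.
weights = truncated_gaussian_response(256, cutoff_bin=32, atten_db=3.0)
```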
This document presents an azimuth interpolation method for radar video echo display with the following technical features:
1. The pixels are divided into three regions, near, middle and far, according to the distribution characteristics of sampling points and pixels;
2. In each subregion, 2^Nr-point fast Fourier and inverse fast Fourier processing is used, which suits software implementation of digital signal processing;
3. A truncated-Gaussian prototype filter is designed to form the low-pass filter and the high-cut filter, used for low-pass filtering and high-frequency suppression of the signal;
4. A nearest-angle sampling point selection technique is designed for selecting the interpolation points.

Claims (2)

1. An azimuth interpolation method for radar video echo display, characterized in that it comprises the following steps:
Step 1: establish the correspondence between display pixels and sampling points
Using the coordinate transformation R = √(X² + Y²), θp = atan(Y/X), compute the correspondence between display pixel (X, Y) and sampling point (R, θp), and build a data table storing the radius values of the corresponding range rings;
Count the number of display pixels S_R on each range ring and quantize it as N = round(log2(S_R)); that is, in subsequent display processing the S_R pixels are interpolated to 2^N points;
Step 2: display the azimuth-interpolated radar echo
Obtain the radar pulse repetition frequency and the antenna rotation rate from the radar system parameters, and compute, according to the following formula, the radar display's number of azimuth samples per revolution:
where PRF is the radar pulse repetition frequency and n is the antenna rotation rate;
Step 3: divide the display pixel regions
The display pixel area corresponding to the radar sampling region is divided into three regions: near, middle and far:
Compare the quantized number of points on a range ring with the number of pulses formed in a single scan: if N < round(log2(S_A)), the display pixels on that range ring are assigned to the near region;
if N = round(log2(S_A)), the display pixels on that range ring are assigned to the middle region;
if N > round(log2(S_A)), the display pixels on that range ring are assigned to the far region, where N is the quantized number of display pixels on the range ring, N = round(log2(S_R)), and round(log2(S_A)) is the quantized number of azimuth samples per scan;
Step 4: process the display pixel regions by class
The near region is down-sampled to remove redundant display data, the middle region is displayed without resampling, and the far region is up-sampled by interpolation to produce the display data.
2. The azimuth interpolation method for radar video echo display according to claim 1, characterized in that the region-specific processing of step 4 is specifically:
A. Near-region redundancy removal
The sampling points at the range corresponding to the display pixels are transformed to the frequency domain (fast Fourier transform, FFT); the frequency-domain signal is low-pass filtered and 2^Nr points are retained in the frequency domain; the inverse video transform (inverse fast Fourier transform, IFFT) is then performed with the matched number of points; the display pixels are chosen from the decimated data by nearest-angle matching of sampling points;
B. Middle-region display processing
No time-frequency transform or inverse transform is performed; the radar's direct sampling points serve as the candidate points for subsequent display; the display pixels are chosen from the directly sampled radar data by nearest-angle matching of sampling points.
C. Far-region interpolation processing
The sampling points at the range corresponding to the display pixels are transformed to the frequency domain (fast Fourier transform, FFT); the frequency-domain signal is first filtered with the high-cut filter selected in Section 3.6, the high-frequency components are zero-padded to form a 2^Nr-point frequency-domain signal, and the inverse video transform (inverse fast Fourier transform, IFFT) is performed with the azimuth-matched number of points; using the curve of FFT sizes over radius and the corresponding table, the display pixels are chosen from the interpolated data by nearest-angle matching of sampling points.
CN201811290044.7A 2018-10-31 2018-10-31 Direction interpolation method for radar video echo display Active CN109444827B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811290044.7A CN109444827B (en) 2018-10-31 2018-10-31 Direction interpolation method for radar video echo display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811290044.7A CN109444827B (en) 2018-10-31 2018-10-31 Direction interpolation method for radar video echo display

Publications (2)

Publication Number Publication Date
CN109444827A true CN109444827A (en) 2019-03-08
CN109444827B CN109444827B (en) 2023-05-30

Family

ID=65550077

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811290044.7A Active CN109444827B (en) 2018-10-31 2018-10-31 Direction interpolation method for radar video echo display

Country Status (1)

Country Link
CN (1) CN109444827B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2067375A (en) * 1980-01-09 1981-07-22 Marconi Co Ltd Improvements in or relating to radars
EP1542033A1 (en) * 2003-12-09 2005-06-15 Oerlikon Contraves Ag Method for displaying images of a surveillance space
CN103376442A (en) * 2012-04-27 2013-10-30 古野电气株式会社 Device and method for displaying information
CN104360324A (en) * 2014-10-31 2015-02-18 中国电子科技集团公司第二十八研究所 Clutter map partitioning method based on image processing
US20150123835A1 (en) * 2012-05-17 2015-05-07 Deep Imaging Technologies Inc. System and Method Using Near and Far Field ULF and ELF Interferometry Synthetic Aperture Radar for Subsurface Imaging
CN104914429A (en) * 2015-05-19 2015-09-16 西安电子科技大学 Target indication radar system capable of adaptively selecting waveform according to target distance

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2067375A (en) * 1980-01-09 1981-07-22 Marconi Co Ltd Improvements in or relating to radars
EP1542033A1 (en) * 2003-12-09 2005-06-15 Oerlikon Contraves Ag Method for displaying images of a surveillance space
CN103376442A (en) * 2012-04-27 2013-10-30 古野电气株式会社 Device and method for displaying information
US20150123835A1 (en) * 2012-05-17 2015-05-07 Deep Imaging Technologies Inc. System and Method Using Near and Far Field ULF and ELF Interferometry Synthetic Aperture Radar for Subsurface Imaging
CN104360324A (en) * 2014-10-31 2015-02-18 中国电子科技集团公司第二十八研究所 Clutter map partitioning method based on image processing
CN104914429A (en) * 2015-05-19 2015-09-16 西安电子科技大学 Target indication radar system capable of adaptively selecting waveform according to target distance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BI HONGKUI et al.: "Solutions to key technical problems of raster-scan radar display systems", Systems Engineering and Electronics *

Also Published As

Publication number Publication date
CN109444827B (en) 2023-05-30

Similar Documents

Publication Publication Date Title
CN106918807B (en) A kind of Targets Dots condensing method of radar return data
CN110988818B (en) Cheating interference template generation method for countermeasure network based on condition generation formula
CN107301661A (en) High-resolution remote sensing image method for registering based on edge point feature
CN106324597B (en) The translational compensation and imaging method of big corner ISAR radar based on PFA
CN108919249B (en) Radar target distance joint estimation method based on two-dimensional local interpolation
CN109116352B (en) Circular scanning ISAR mode ship super-resolution imaging method
CN108961255B (en) Sea-land noise scene segmentation method based on phase linearity and power
CN108562879A (en) Shipborne radar CFAR detection method based on FPGA
CN110046619A (en) The full-automatic shoal of fish detection method of unmanned fish finding ship and system, unmanned fish finding ship and storage medium
CN111781595A (en) Complex maneuvering group target imaging method based on matching search and Doppler ambiguity resolution
CN110929598B (en) Unmanned aerial vehicle-mounted SAR image matching method based on contour features
CN113837924B (en) Water shoreline detection method based on unmanned ship sensing system
CN113466863B (en) SAR ship target resolution imaging method
CN109444827A (en) A kind of orientation interpolation method shown for identification by radar echoes
CN112363144B (en) Ring-scan radar distance ambiguity and azimuth ambiguity identification method
CN108594196B (en) Method and device for extracting parameters of target scattering center
CN109345583B (en) SAR target image geometric dimension estimation method based on OMP
CN115407282B (en) SAR active deception jamming detection method based on interference phase under short base line
CN116559905A (en) Undistorted three-dimensional image reconstruction method for moving target of bistatic SAR sea surface ship
CN116542885A (en) Infrared ship trail image enhancement method
CN115601278A (en) High-precision motion error compensation method based on sub-image registration
CN105551013B (en) SAR image sequence method for registering based on motion platform parameter
CN105223571B (en) The ISAR imaging method significantly paid attention to based on weighting L1 optimization with vision
CN113406634A (en) Time domain phase matching-based ISAR (inverse synthetic aperture radar) three-dimensional imaging method for space high-speed spinning target
He et al. Research on Solid Rate Filtering Technique based on Inverse Distance Weighted Interpolation of Navigation Radar

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant