CN112083418A - Moving target joint pre-detection tracking method of video synthetic aperture radar - Google Patents


Info

Publication number
CN112083418A
CN112083418A CN202011068438.5A
Authority
CN
China
Legal status: Granted
Application number: CN202011068438.5A
Other languages: Chinese (zh)
Other versions: CN112083418B (en)
Inventor
丁金闪 (Ding Jinshan)
秦思琪 (Qin Siqi)
徐众 (Xu Zhong)
Current Assignee: Xidian University
Original Assignee: Xidian University
Application filed by Xidian University
Priority to CN202011068438.5A
Publication of CN112083418A
Application granted; publication of CN112083418B
Legal status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90 Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9004 SAR image acquisition techniques
    • G01S13/9005 SAR image acquisition techniques with optical processing of the SAR signals


Abstract

The invention discloses a joint pre-detection tracking (track-before-detect) method for moving targets of a video synthetic aperture radar, which mainly solves the problems of the large computational load and poor moving-target detection performance of traditional track-before-detect methods. The scheme is as follows: divide continuous multi-frame radar observation data into image data and range-Doppler data; perform preliminary detection on the first two frames of the image data and initialize the track heads to form candidate areas; apply a state transition to all state points in the candidate area of the previous frame to obtain the candidate area of the target in the next frame, first screen the candidate area jointly using the image and the range-Doppler data, and then calculate the value function and the backtracking function; after all frame data are processed, declare a target by comparing the value function with a threshold, and obtain the predicted track of the target through the backtracking function. On the premise of guaranteeing a low false alarm rate and a high detection rate for maneuvering targets, the method effectively reduces the computational load of track-before-detect, and can be used for efficient detection of maneuvering-target shadows by radar systems.

Description

Moving target joint pre-detection tracking method of video synthetic aperture radar
Technical Field
The invention belongs to the technical field of radar signal processing, and particularly relates to a joint pre-detection tracking method for moving targets, which can be used by a video synthetic aperture radar for shadow-based detection of moving targets.
Background
The video synthetic aperture radar (ViSAR) is a high-frame-rate imaging system operating in a high frequency band. It usually adopts a spotlight (beam-bunching) imaging mode and obtains a high-frame-rate image sequence through overlapped-aperture processing to observe the imaged area in real time; a range-Doppler spectrum is obtained by processing the echoes of each frame. Within the synthetic aperture time, because the target shields the ground area beneath it, and because the relative motion between the target and the radar produces a Doppler shift that displaces the target signature, a moving target leaves a shadow at its actual position. Based on this characteristic, detecting moving targets from their dynamic shadows in video synthetic aperture radar images has become a new path for SAR moving-target detection. The common way to detect targets from moving-target shadows in ViSAR images is to remove background clutter by image processing and then apply a constant false alarm rate threshold test to each frame to be detected. This procedure is generally called detect-before-track, and its detection performance is severely limited in low signal-to-noise-ratio environments.
Track-before-detect (TBD) is a newer detection technique aimed at weak targets. Compared with the traditional detect-then-track approach, it accumulates energy over multiple consecutive frames and processes them jointly, producing the detection and tracking results of the target simultaneously, which markedly improves the radar's detection and tracking performance for weak targets. TBD is therefore well suited to the moving-target shadow detection task of the video synthetic aperture radar, and TBD based on the dynamic programming (DP) strategy has good performance and a wide application range. However, current research on TBD-based shadow detection for video synthetic aperture radar is still incomplete: only image data are used for detection, while the velocity information in the range-Doppler spectrum is left unused, wasting resources and degrading detection performance for highly maneuvering targets; moreover, the computational load of TBD grows rapidly with the number of frames, making its complexity too high. A new TBD technique is therefore urgently needed for efficient detection of ViSAR moving-target shadows.
Zhang et al., in the article "A novel approach to moving targets shadow detection in video SAR image sequence", propose a classic shadow detection method based on image processing, comprising image registration, speckle noise suppression, background extraction, difference processing, morphological processing and connected-domain detection. The background is extracted using the multi-frame information of the video synthetic aperture radar image sequence, and shadow targets are then detected by morphological processing on the background-removed images. The method is computationally intensive, and its performance degrades severely as noise increases.
Tian et al., in the paper "Simultaneous Detection and Tracking of Moving-Target Shadows in ViSAR Imagery", propose a moving-target shadow detection method based on track-before-detect. It adopts an expand-then-contract strategy: the states of each frame are expanded with a Gaussian distribution and then, after the state transition, contracted on the basis of a pixel-value likelihood ratio, retaining a small number of large-weight particles; after the state transition of the last frame, the track with the largest value function is selected as the target track, completing the per-frame detection of the target while producing its track. The method improves the distribution of the state candidate regions of the traditional TBD algorithm so that they better cover the target, and improves detection performance through the accumulation of multi-frame echo data. It has two shortcomings, however: the computational load expands rapidly with the number of frames, and because only image data are used, without the range-Doppler spectrum, detection performance for strongly maneuvering targets is poor.
Disclosure of Invention
The aim of the invention is to provide a joint pre-detection tracking method for moving targets of a video synthetic aperture radar that overcomes the above defects of the prior art, so as to reduce the computational load, reduce the false alarm rate for moving targets and improve the detection precision.
The technical scheme of the invention is as follows: on the basis of a track-before-detect method using a dynamic programming strategy, the initial states of candidate targets are extracted automatically by preprocessing, the information in the image and in the Doppler spectrum is used jointly, and shadow detection of moving targets in the video synthetic aperture radar is carried out with the dynamic-programming track-before-detect technique. The implementation steps are as follows:
1. A joint pre-detection tracking method for moving targets of a video synthetic aperture radar, characterized by comprising the following steps:
(1) dividing the original echo signal of the video synthetic aperture radar into R sub-apertures, each containing N pulses, and acquiring the high-resolution video synthetic aperture radar images I = {i_1, i_2, ..., i_j, ..., i_R} of the sub-apertures and the corresponding low-resolution range-Doppler spectra D = {d_1, d_2, ..., d_j, ..., d_R}, where i_j represents the j-th frame image, d_j represents the j-th range-Doppler spectrum, j = 1, 2, ..., R;
(2) performing image registration and two-parameter constant false alarm rate detection in sequence on the first two frames {i_1, i_2} of the video synthetic aperture radar images, obtaining the sets P_1 and P_2 of all preliminary detection points in the first two frames;
(3) determining all candidate track heads of the first frame from the preliminary detection point sets P_1 and P_2 by distance matching, and recording the set of all candidate track heads as C = {s_1^i, i = 1, 2, ..., M}, where s_1^i represents the i-th candidate track head of the 1st frame and M is the total number of candidate track heads;
(4) initializing, for the i-th target in frame 1, the value function V_1^i, the backtracking function B_1^i and the image candidate area Ψ_1^i;
(5) letting the current frame be the k-th frame, k ≥ 2, and the image candidate area of the i-th target in the (k-1)-th frame be Ψ_{k-1}^i, performing a state transition on all state points in the candidate area of the previous frame to form the image candidate area Ψ_k^i of the i-th target in the k-th frame;
(6) mapping the image candidate area Ψ_k^i of the i-th target of the k-th frame into the range-Doppler spectrum, forming the candidate region Φ_k^i in the range-Doppler spectrum d_k:
(6a) for any state point s_k^{i,num} in the image candidate region Ψ_k^i, obtaining the true state [x' v_x' y' v_y']^T by geometric transformation, scaling and state transition;
(6b) according to the true state [x' v_x' y' v_y']^T and the radar flight parameters, obtaining the point p_k^{i,num} in the range-Doppler spectrum corresponding to the state point s_k^{i,num};
(6c) repeating (6a) to (6b) until the image candidate region Ψ_k^i has been traversed, forming the corresponding candidate region Φ_k^i of the Doppler spectrum d_k;
(7) screening the state points:
(7a) screening the sub-candidate regions in the range-Doppler spectrum candidate region Φ_k^i to obtain the screened range-Doppler spectrum candidate region Φ'_k^i, and, according to the correspondence between the image and the Doppler spectrum, obtaining the new image candidate area Ψ'_k^i from the screened range-Doppler spectrum candidate region Φ'_k^i;
(7b) setting a minimum amplitude threshold T_min and deleting all state points in the new image candidate area Ψ'_k^i whose amplitude is smaller than the threshold, forming the final image candidate area Ψ''_k^i; then assigning Ψ''_k^i to the image candidate area Ψ_k^i of the i-th target in the k-th frame, completing the screening of the state points in the image;
(8) obtaining, by accumulation and search respectively, the value function V_k^{i,num} and the backtracking function B_k^{i,num} of all state points in the screened Ψ_k^i, where s_k^{i,num} denotes the num-th state of the i-th target in the k-th frame, num = 1, 2, ..., N_k^i, and N_k^i is the total number of state points in Ψ_k^i;
(9) repeating (5) to (8) until k = R, and comparing the maximum value function max_num V_R^{i,num} in the R-th frame with the set threshold V_T: if max_num V_R^{i,num} < V_T, the preliminary detection is considered a false alarm and no target is found; otherwise a target is declared, and by continuously backtracking the state of each frame through the updated backtracking function, the predicted track T^i = {ŝ_1^i, ŝ_2^i, ..., ŝ_R^i} of the i-th target over all R frames is obtained, where ŝ_j^i represents the predicted state of the i-th target in the j-th frame;
(10) repeating steps (4) to (9) in sequence until all candidate targets have been traversed, obtaining the candidate tracks of all targets and completing the detection of the targets.
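The dynamic-programming recursion of steps (4) to (9) (accumulate a value function frame by frame, record the best predecessor in a backtracking function, threshold the final value, then backtrack the track) can be sketched on a toy 1-D problem. The data, the `dp_tbd` name and the `max_step` transition bound are illustrative, not from the patent; the actual method works on 2-D position-velocity states and additionally screens candidates with the range-Doppler spectrum.

```python
import numpy as np

def dp_tbd(frames, max_step=1):
    """Toy 1-D dynamic-programming TBD: accumulate per-frame amplitudes
    along admissible trajectories (position changes by at most max_step
    per frame), then backtrack the best track."""
    K, N = frames.shape
    V = frames[0].astype(float)               # value function, frame 1
    B = np.zeros((K, N), dtype=int)           # backtracking function
    for k in range(1, K):
        Vk = np.full(N, -np.inf)
        for s in range(N):
            lo, hi = max(0, s - max_step), min(N, s + max_step + 1)
            pred = lo + int(np.argmax(V[lo:hi]))   # best predecessor state
            Vk[s] = V[pred] + frames[k, s]
            B[k, s] = pred
        V = Vk
    s = int(np.argmax(V))                     # best terminal state
    track = [s]
    for k in range(K - 1, 0, -1):             # backtrack frame by frame
        s = B[k, s]
        track.append(s)
    return float(V.max()), track[::-1]

# A target drifting one cell per frame through weak noise
rng = np.random.default_rng(0)
frames = 0.1 * rng.random((5, 20))
for k, pos in enumerate([3, 4, 5, 5, 6]):
    frames[k, pos] += 1.0
score, track = dp_tbd(frames)
print(track)
```

In the patent, the terminal comparison of `score` against a threshold V_T decides between "target" and "false alarm" before the backtracking is performed.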
Compared with the prior art, the invention has the following advantages:
1) Because the pre-detection tracking algorithm is applied to the shadow detection of video SAR moving targets, the invention greatly improves detection performance for weak targets by accumulating the energy of multiple consecutive frames, escaping the limitations of traditional single-frame detection.
2) By sequentially performing image registration, two-parameter constant false alarm rate detection and distance matching on the first two frames, the initial states of the candidate targets are extracted automatically, which reduces the preliminary-detection false alarm rate, improves detection efficiency, and makes the whole tracking process more automatic and intelligent.
3) Because the position information of the target in the image is combined with its velocity information in the range-Doppler spectrum, candidate points with mismatched velocities or incorrect positions are deleted, greatly reducing the algorithm's running time, while more state points conforming to the target's motion law are retained, improving the shadow detection and tracking performance for maneuvering targets.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 is a usage scenario diagram of the present invention;
FIG. 3 is a diagram of the actual imaging geometry in the present invention;
fig. 4 is a diagram of simulation results of moving target detection in video synthetic aperture radar data using the present invention.
Detailed Description
Embodiments and effects of the present invention will be further described below with reference to the accompanying drawings.
Referring to fig. 1, the implementation steps of the present invention are as follows:
step one, obtaining a high-resolution video synthetic aperture radar image and a corresponding low-resolution range-Doppler spectrum.
1.1) setting the complex echo signal of the video synthetic aperture radar as an echo matrix of J range gates and Z pulses, and dividing it into R sub-apertures with an overlapped sliding window of length N and step S, where R = ⌊(Z - N)/S⌋ + 1, ⌊·⌋ represents the rounding-down operation, and S < N;
1.2) imaging the N pulses in each sub-aperture with the polar format imaging algorithm to obtain a high-resolution video synthetic aperture radar image; extracting W consecutive pulses from the centre of each sub-aperture and performing range compression and azimuth Fourier transform in sequence to obtain the corresponding low-resolution range-Doppler spectrum;
1.3) processing the echo data of each sub-aperture in the manner of 1.2), finally obtaining R frames of image data I = {i_1, i_2, ..., i_j, ..., i_R} and R frames of range-Doppler data D = {d_1, d_2, ..., d_j, ..., d_R}, where i_j represents the j-th frame image, d_j represents the j-th range-Doppler spectrum, j = 1, 2, ..., R.
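The sub-aperture bookkeeping of 1.1) can be illustrated with a minimal sketch. All parameter values here are invented for illustration, and the imaging and spectrum-forming operations of 1.2) are omitted:

```python
import numpy as np

# Illustrative parameters (not from the patent): J range gates,
# Z pulses, window length N, step S < N.
J, Z, N, S = 64, 1024, 256, 64

echo = np.zeros((J, Z), dtype=complex)        # placeholder echo matrix
R = (Z - N) // S + 1                          # number of sub-apertures (floor)
subapertures = [echo[:, r * S : r * S + N] for r in range(R)]

# Adjacent sub-apertures overlap by N - S pulses, which is what yields
# the high frame rate of the overlapped-aperture processing.
print(R, subapertures[0].shape)
```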
Step two, performing preliminary detection on the first two frame images {i_1, i_2} to obtain the sets P_1 and P_2 of all preliminary detection points.
2.1) for the first two frame images {i_1, i_2}, taking the adjacent frame as the reference image, registering each frame with its reference image, and differencing the registered image with the reference image to extract the moving target, obtaining the two preprocessed frame images {i_1', i_2'};
2.2) setting the false alarm rate of the preliminary detection to f_a and calculating the threshold T_h of the two-parameter constant false alarm rate detector as T_h = Φ^{-1}(1 - f_a), where Φ^{-1}(·) is the inverse of the standard normal cumulative distribution function;
2.3) letting the point to be detected in the preprocessed first frame image i_1' be p = (x, y), x = 1, ..., N_x, y = 1, ..., N_y, where N_x and N_y are the two dimensions of i_1'; setting the detection area of the point p to be detected as Θ, and calculating the mean μ and the standard deviation σ in the detection area Θ as:
μ = (1/N_Θ) Σ_{p_i ∈ Θ} i_1'(p_i)
σ = sqrt( (1/N_Θ) Σ_{p_i ∈ Θ} (i_1'(p_i) - μ)² )
where p_i is a point in the detection area Θ, N_Θ is the total number of points in the detection area, and i_1'(p_i) represents the pixel value of the point p_i in the preprocessed first frame image i_1';
2.4) calculating the detection quantity T of the point p to be detected from the results of 2.3): T = |i_1'(p) - μ| / σ;
2.5) comparing the detection quantity T with the threshold T_h to judge whether a target exists at the point p to be detected: if T ≥ T_h, a target is judged to exist at the detection point; otherwise, no target is judged to exist;
2.6) traversing all detection points in the preprocessed first frame image i_1' to obtain the constant false alarm detection result of the first frame image i_1'; after morphological closing, opening and connected-domain processing are performed in sequence, the centroid coordinates of each connected domain are extracted to obtain the set of first-frame preliminary detection points P_1 = {(x_i, y_i), i = 1, ..., U_1}, where U_1 is the total number of detection points in P_1;
2.7) repeating 2.2) to 2.6) on the second frame image i_2' to obtain the set of second-frame preliminary detection points P_2 = {(x_j, y_j), j = 1, ..., U_2}, where U_2 is the total number of detection points in P_2.
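A minimal sketch of the two-parameter CFAR of steps 2.2) to 2.5), assuming a Gaussian background, the threshold Φ^{-1}(1 - f_a) and the absolute-value form of the detection quantity; the window size and the test image are illustrative (the morphological postprocessing of 2.6) is omitted):

```python
import numpy as np
from statistics import NormalDist

def two_param_cfar(img, win=5, fa=1e-3):
    """Two-parameter CFAR sketch: estimate mean/std in a local window
    around each pixel and compare the normalized detection quantity
    with the Gaussian threshold Th = Phi^-1(1 - fa)."""
    Th = NormalDist().inv_cdf(1.0 - fa)       # threshold from false alarm rate
    Ny, Nx = img.shape
    hits = np.zeros((Ny, Nx), dtype=bool)
    h = win // 2
    for y in range(h, Ny - h):
        for x in range(h, Nx - h):
            patch = img[y - h:y + h + 1, x - h:x + h + 1]
            mu, sigma = patch.mean(), patch.std()
            # |.| detection quantity flags pixels far from the local mean,
            # which for ViSAR shadows means anomalously dark pixels
            if sigma > 0 and abs(img[y, x] - mu) / sigma >= Th:
                hits[y, x] = True
    return hits

rng = np.random.default_rng(1)
img = rng.normal(0.5, 0.05, (32, 32))
img[16, 16] = 0.0                             # one dark "shadow" pixel
hits = two_param_cfar(img)
print(bool(hits[16, 16]), int(hits.sum()))
```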
Step three, determining the set C of all candidate track heads of the first frame from the first-frame preliminary detection point set P_1 and the second-frame preliminary detection point set P_2 by distance matching.
3.1) setting the distance range of the moving target between two adjacent frames to [0, L_max] and initializing the candidate track head set C = ∅, where L_max is the maximum distance the moving target can travel between two adjacent frames;
3.2) calculating the distance L between a point (x_i, y_i) in the first-frame preliminary detection point set P_1 and the point (x_near, y_near) in the second-frame preliminary detection point set P_2 closest to (x_i, y_i):
L = sqrt( (x_near - x_i)² + (y_near - y_i)² );
3.3) judging whether the distance L is within the range [0, L_max]: if not, discarding the point (x_i, y_i) as a false alarm point; if so, calculating the velocity (v_xi, v_yi) of the point (x_i, y_i):
v_xi = (x_near - x_i)/t,  v_yi = (y_near - y_i)/t
where t is the time difference between two adjacent frames, v_xi is the lateral velocity of the point (x_i, y_i) and v_yi is its longitudinal velocity;
3.4) adding the initial state [x_i v_xi y_i v_yi]^T of the target to the candidate track head set C, where T represents transposition;
3.5) repeating 3.2) to 3.4) until all points in P_1 have been traversed, obtaining the final candidate track head set C = {s_1^i, i = 1, 2, ..., M}, where s_1^i represents the initial state of the i-th candidate target in the first frame, i = 1, 2, ..., M, and M is the total number of candidate track heads.
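Steps 3.1) to 3.5) can be sketched as a nearest-neighbour gating loop; the point sets, `L_max` and the frame interval `t` below are illustrative values, not from the patent:

```python
import numpy as np

def init_track_heads(P1, P2, L_max, t):
    """Distance-matching sketch of step three: for each first-frame
    detection, find the nearest second-frame detection; keep it as a
    candidate track head [x, vx, y, vy] if that nearest distance lies
    within [0, L_max], otherwise discard it as a false alarm."""
    C = []
    P2 = np.asarray(P2, dtype=float)
    for (xi, yi) in P1:
        d = np.hypot(P2[:, 0] - xi, P2[:, 1] - yi)
        j = int(np.argmin(d))
        if d[j] <= L_max:                     # gate on inter-frame distance
            vx = (P2[j, 0] - xi) / t
            vy = (P2[j, 1] - yi) / t
            C.append([xi, vx, yi, vy])
    return C

P1 = [(10.0, 10.0), (50.0, 50.0)]
P2 = [(12.0, 11.0), (90.0, 90.0)]             # second P1 point matches nothing
C = init_track_heads(P1, P2, L_max=5.0, t=0.5)
print(C)   # one surviving track head
```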
Step four, initializing the value function V_1^i, the backtracking function B_1^i and the image candidate area Ψ_1^i of the i-th target in frame 1.
4.1) letting the initial state of the i-th candidate target in the first frame be s_1^i = [x_1^i v_x1^i y_1^i v_y1^i]^T; the value function V_1^i is then:
V_1^i = i_1(x_1^i, y_1^i)
where x_1^i, v_x1^i, y_1^i, v_y1^i are respectively the abscissa, lateral velocity, ordinate and longitudinal velocity of the initial state s_1^i, i_1(x_1^i, y_1^i) denotes the pixel value of the point (x_1^i, y_1^i) in the first frame image i_1, and s_1^i(n) denotes the n-th element of the state s_1^i, n = 1, 2, 3, 4;
4.2) initializing the backtracking function B_1^i = 0;
4.3) initializing the image candidate region Ψ_1^i = {s_1^i}.
Step five, performing a state transition on all state points in the image candidate area Ψ_{k-1}^i of the previous frame to form the image candidate area Ψ_k^i of the i-th target in the k-th frame.
5.1) expressing the image candidate area Ψ_{k-1}^i as:
Ψ_{k-1}^i = {s_{k-1}^{i,num}, num = 1, 2, ..., N_{k-1}^i}
s_{k-1}^{i,num} = [x_{k-1}^{i,num} vx_{k-1}^{i,num} y_{k-1}^{i,num} vy_{k-1}^{i,num}]^T ∈ S
where S is the discrete state space, s_{k-1}^{i,num} denotes the num-th state of the i-th target in the (k-1)-th frame, N_{k-1}^i represents the total number of states of the i-th target in the (k-1)-th frame, x_{k-1}^{i,num}, vx_{k-1}^{i,num}, y_{k-1}^{i,num}, vy_{k-1}^{i,num} respectively represent the lateral coordinate, lateral velocity, longitudinal coordinate and longitudinal velocity of the state on the (k-1)-th frame i_{k-1}, and T represents transposition;
5.2) using a constant-velocity transition model, performing the state transition on any state point s_{k-1}^{i,num} of the image candidate area Ψ_{k-1}^i to obtain the num-th state of the i-th target in the k-th frame:
s_k^{i,num} = F s_{k-1}^{i,num} + G [ε_x ε_y]^T
F = I_2 ⊗ [1 t; 0 1],  G = I_2 ⊗ [t²/2; t]
ε_x ∈ {ε_x1, ε_x2, ..., ε_xn}
ε_y ∈ {ε_y1, ε_y2, ..., ε_ym}
where I_2 is the second-order identity matrix, ⊗ represents the Kronecker product of matrices, t is the time interval between two frames, ε_x and ε_y are the acceleration process noise in the x and y directions respectively, the x-direction acceleration process noise set {ε_x1, ε_x2, ..., ε_xn} is the selectable range of ε_x, the y-direction acceleration process noise set {ε_y1, ε_y2, ..., ε_ym} is the selectable range of ε_y, n represents the total number of elements in the set {ε_x1, ε_x2, ..., ε_xn}, m represents the total number of elements in the set {ε_y1, ε_y2, ..., ε_ym}, and T denotes transposition.
5.3) taking any element of the x-direction acceleration process noise set {ε_x1, ε_x2, ..., ε_xn} as the value of ε_x and any element of the y-direction acceleration process noise set {ε_y1, ε_y2, ..., ε_ym} as the value of ε_y, until all combinations of the acceleration process noise have been traversed; the state point s_{k-1}^{i,num} thus generates n × m different state transition points after the state transition, which form one sub-candidate region Ψ_k^{i,η} of the i-th target in the k-th frame;
5.4) repeating 5.2) to 5.3) until the image candidate area Ψ_{k-1}^i has been traversed; all the state points after the state transition form the image candidate area of the i-th target of the k-th frame:
Ψ_k^i = {s_k^{i,num}, num = 1, 2, ..., N_k^i} = {Ψ_k^{i,η}, η = 1, 2, ..., N_{k-1}^i}
s_k^{i,num} = [x_k^{i,num} vx_k^{i,num} y_k^{i,num} vy_k^{i,num}]^T
where x_k^{i,num}, vx_k^{i,num}, y_k^{i,num}, vy_k^{i,num} respectively represent the lateral coordinate, lateral velocity, longitudinal coordinate and longitudinal velocity of the state point s_k^{i,num} on the k-th frame i_k; N_k^i is the total number of states; Ψ_k^{i,η} is the η-th sub-candidate region of the i-th target of the k-th frame, η = 1, 2, ..., N_{k-1}^i indexes all the sub-candidate regions; T denotes transposition.
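The constant-velocity transition of 5.2) and 5.3) can be sketched as follows. The frame interval and the noise sets are illustrative, and the exact forms of F and G are reconstructed here under standard constant-velocity-model assumptions (the patent's own matrices are given only as figures):

```python
import numpy as np

t = 0.5                                   # frame interval (illustrative)
F = np.kron(np.eye(2), [[1, t], [0, 1]])  # transition matrix I2 (x) [1 t; 0 1]
G = np.kron(np.eye(2), [[t**2 / 2], [t]]) # acceleration-noise input matrix

eps_x = [-1.0, 0.0, 1.0]                  # x-direction acceleration noise set
eps_y = [-1.0, 0.0, 1.0]                  # y-direction acceleration noise set

s = np.array([10.0, 4.0, 10.0, 2.0])      # state [x, vx, y, vy]

# One state point spawns n*m transition points: the sub-candidate region
sub_region = [F @ s + G @ np.array([ex, ey]) for ex in eps_x for ey in eps_y]
print(len(sub_region))                    # n * m = 9
print(sub_region[4])                      # ex = ey = 0: pure CV prediction
```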
Step six, mapping the image candidate area Ψ_k^i into the range-Doppler spectrum to form the candidate region Φ_k^i in the range-Doppler spectrum.
6.1) for any state point s_k^{i,num} in the image candidate region Ψ_k^i, obtaining the true state [x' v_x' y' v_y']^T by geometric transformation, scaling and state transition.
Referring to fig. 2, the area ABCD represents the image i_k and the area EFGH represents the scene image; this step is implemented as follows:
6.1.1) taking any state point s_k^{i,num} from the candidate region Ψ_k^i, whose position corresponds to the point P in fig. 2, with abscissa x_0 and ordinate y_0:
x_0 = s_k^{i,num}(1),  y_0 = s_k^{i,num}(3)
where s_k^{i,num}(1) denotes the first element of the state s_k^{i,num}, i.e. the abscissa of the state point, and s_k^{i,num}(3) denotes its third element, i.e. the ordinate of the state point;
6.1.2) transforming the coordinates (x_0, y_0) of the point P through the geometric relationship into the coordinates (x_1, y_1) in the corresponding scene image:
x_1 = α cos θ - β sin θ + N/2
y_1 = α sin θ + β cos θ + J/2
where θ is the rotation angle of the radar platform relative to the scene at this moment, N represents the number of pulses, J represents the number of range gates, α = x_0 - N/2, β = y_0 - J/2, x_1 represents the abscissa of the state point s_k^{i,num} in the scene image and y_1 represents its ordinate;
6.1.3) converting the coordinates (x_1, y_1) by scaling into the actual position coordinates (x', y') in the stationing coordinate system:
x' = ρ_α x_1,  y' = ρ_r y_1
where ρ_r represents the range resolution of the video synthetic aperture radar system, ρ_α represents its azimuth resolution, x' denotes the actual abscissa of the state s_k^{i,num} in the stationing coordinate system and y' denotes its actual ordinate;
6.1.4) performing a state transition on the state point s_k^{i,num} to obtain the state of the target in the next frame and converting it to the actual position coordinates in the stationing coordinate system according to 6.1.1) to 6.1.3); from the position difference of the two points and the time interval between the two frames, the actual velocity v_x', v_y' of the state point s_k^{i,num} is obtained, and x', y', v_x', v_y' then compose the true state [x' v_x' y' v_y']^T.
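Steps 6.1.1) to 6.1.3) amount to a rotation about the image centre followed by resolution scaling. The sketch below assumes one particular rotation sign convention and axis/resolution pairing (the patent's figure-based formulas do not survive in this text), with illustrative parameters:

```python
import numpy as np

def image_to_scene(x0, y0, theta, N, J, rho_a, rho_r):
    """Rotate image coordinates about the image centre (N/2, J/2) by the
    platform rotation angle theta, then scale by the azimuth and range
    resolutions to obtain scene (stationing) coordinates in metres.
    Rotation sign and resolution pairing are assumptions."""
    a, b = x0 - N / 2, y0 - J / 2             # alpha, beta: centred coordinates
    x1 = a * np.cos(theta) - b * np.sin(theta) + N / 2
    y1 = a * np.sin(theta) + b * np.cos(theta) + J / 2
    return rho_a * x1, rho_r * y1             # (x', y')

xp, yp = image_to_scene(160, 80, 0.0, N=256, J=128, rho_a=0.2, rho_r=0.3)
print(xp, yp)    # theta = 0: pure resolution scaling
```

As 6.1.4) describes, applying this transform to a state point and to its one-frame prediction, then dividing the position difference by the frame interval, yields the actual velocity components v_x', v_y'.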
6.2) according to the true state [x' v_x' y' v_y']^T and the radar flight parameters, obtaining the point p_k^{i,num} in the range-Doppler spectrum corresponding to the state point s_k^{i,num}.
Referring to fig. 3, this step is implemented as follows:
6.2.1) the radar moves at constant speed along a circular path from point A to point B; its position coordinates at the moment of generating the image i_k are (x_R, y_R, z_R) and its velocity vector is V_R. Setting the coordinate vector of the point K in the actual scene corresponding to the target to (x', y', 0) and its velocity vector to V_K = (v_x', v_y', 0), calculate the normalized slant-range vector r of the line connecting the target and the radar as:
r = ( (x_R, y_R, z_R) - (x', y', 0) ) / | (x_R, y_R, z_R) - (x', y', 0) |
where |·| represents the modulus of a vector;
6.2.2) from the slant-range vector r, the radar velocity V_R and the target velocity V_K, calculate the radial velocity v_RK of the moving target K relative to the radar:
v_RK = (V_K - V_R) · r
where · represents the dot product;
6.2.3) from the radial velocity v_RK, calculate the Doppler frequency f_DK of the moving target K relative to the radar as:
f_DK = 2 v_RK / λ
where λ represents the wavelength of the radar-transmitted signal;
6.2.4) from the calculation results in 6.2.3), obtain the abscissa and ordinate of the point p_k^{i,num} corresponding to the state point s_k^{i,num} in the Doppler spectrum d_k: the abscissa is obtained by quantizing the Doppler frequency f_DK onto the Doppler-frequency axis of d_k, and the ordinate by quantizing the slant range of the target onto the range axis, where s_k^{i,num}(3) denotes the third element of the state s_k^{i,num}.
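Steps 6.2.1) to 6.2.3) can be sketched directly from the vector definitions; the sign conventions and all numeric values below are illustrative assumptions:

```python
import numpy as np

def doppler_of_target(radar_pos, radar_vel, tgt_pos, tgt_vel, wavelength):
    """Normalized target-to-radar line-of-sight vector r, relative radial
    velocity v_RK = (V_K - V_R) . r, and Doppler frequency 2*v_RK/lambda.
    Sign conventions are assumptions."""
    r = np.asarray(radar_pos, float) - np.asarray(tgt_pos, float)
    r /= np.linalg.norm(r)                    # normalized slant-range vector
    v_rk = np.dot(np.asarray(tgt_vel, float) - np.asarray(radar_vel, float), r)
    return 2.0 * v_rk / wavelength            # Doppler frequency (Hz)

# Target closing on the radar at 10 m/s along the line of sight, Ka-band
f = doppler_of_target(radar_pos=(0, 0, 3000), radar_vel=(0, 0, 0),
                      tgt_pos=(0, 4000, 0), tgt_vel=(0, -8, 6),
                      wavelength=0.0086)
print(round(f, 1))
```

Quantizing `f` onto the Doppler axis of d_k (and the slant range onto its range axis), as 6.2.4) describes, then gives the pixel coordinates of p_k^{i,num}.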
6.3) repeating steps 6.1.1) to 6.2.4) until the candidate region Ψ_k^i has been traversed; after the traversal, all the state points form the corresponding candidate region Φ_k^i of the Doppler spectrum d_k, namely:
Φ_k^i = {p_k^{i,num}, num = 1, 2, ..., N_k^i} ⊂ D_s
p_k^{i,num} = (x_p^{i,num}, y_p^{i,num})
where D_s represents the discrete Doppler space, x_p^{i,num} and y_p^{i,num} respectively represent the lateral and longitudinal coordinates of the point p_k^{i,num} on the Doppler spectrum, and N_k^i represents the total number of states in the candidate region Φ_k^i.
Step seven: screening the candidate points.
7.1) Screen the sub candidate regions of the range-Doppler spectrum candidate region to obtain the screened range-Doppler spectrum candidate region and the corresponding new image candidate region:
7.1.1) According to the correspondence of step six, each sub candidate region of the image candidate region has a corresponding sub candidate region in the Doppler spectrum candidate region; the range-Doppler spectrum candidate region is therefore expressed as the union of its sub candidate regions, indexed by n from 1 up to the total number of sub candidate regions;
7.1.2) Arbitrarily take a sub candidate region, sort its candidate points from large to small according to their amplitude on the Doppler spectrum $d_k$, and take the first Ns candidate points to form the screened sub candidate region, with Ns ≥ 16;
7.1.3) Repeat step 7.1.2) until all the sub candidate regions are traversed, forming the screened range-Doppler spectrum candidate region;
7.1.4) According to the correspondence between the image and the Doppler spectrum, obtain the new image candidate region corresponding to the screened range-Doppler spectrum candidate region, finishing the screening of candidate points in the Doppler spectrum;
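The per-region selection of step 7.1.2) (keep the Ns largest-amplitude points of each sub candidate region) can be sketched as below; the function name and the (row, col) container layout are illustrative assumptions:

```python
import numpy as np

def screen_sub_region(points, doppler_spectrum, ns=16):
    """Step 7.1.2): sort the candidate points of one sub candidate region by
    their amplitude on the range-Doppler spectrum d_k, descending, and keep
    the first ns (Ns >= 16). `points` holds (row, col) pixel indices."""
    amps = np.array([abs(doppler_spectrum[r, c]) for r, c in points])
    order = np.argsort(-amps)          # indices sorted by descending amplitude
    return [points[i] for i in order[:ns]]
```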
7.2) Screen the candidate points of the new image candidate region: set the amplitude threshold $T_{min}$ and delete all candidate points whose pixel values are smaller than $T_{min}$, obtaining the image candidate region after the screening is finished.
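Step 7.2) deletes every candidate whose image amplitude falls below $T_{min}$; a minimal sketch, in which the [x, vx, y, vy] state layout and the image[y, x] indexing are assumptions:

```python
import numpy as np

def threshold_filter(states, image, t_min):
    """Step 7.2): keep only the state points whose pixel value on the frame
    image i_k is at least the amplitude threshold T_min."""
    # state layout assumed [x, vx, y, vy]; pixel looked up at (row=y, col=x)
    return [s for s in states if image[int(s[2]), int(s[0])] >= t_min]
```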
Step eight: calculate the value functions and backtracking functions of all state points of the screened image candidate region.
8.1) The screened image candidate region is expressed as the set $\{s_k^{i,num} \mid num = 1, \dots, N_k^i\}$ of the discrete state space, where $N_k^i$ is the total number of state points after screening, $s_k^{i,num} = [x,\ v_x,\ y,\ v_y]^T$ is the num-th state of the i-th target in the k-th frame, its elements are the lateral coordinate, lateral velocity, longitudinal coordinate and longitudinal velocity on the k-th frame image $i_k$, and T denotes transposition;
8.2) Take any state point $s_k^{i,num}$ of the screened image candidate region; its value function $V(s_k^{i,num})$ is obtained by accumulation and its backtracking function $B(s_k^{i,num})$ by search:

$V(s_k^{i,num}) = i_k(x_k^{i,num}, y_k^{i,num}) + \max_{s \in \tau(s_k^{i,num})} V(s)$

$B(s_k^{i,num}) = \arg\max_{s \in \tau(s_k^{i,num})} V(s)$

where $\tau(s_k^{i,num})$ is the set of state points of the image candidate region of frame k-1 that can be transferred to the state $s_k^{i,num}$, $i_k(x_k^{i,num}, y_k^{i,num})$ is the pixel value of image $i_k$ at abscissa $x_k^{i,num}$ and ordinate $y_k^{i,num}$, and $V(s)$ denotes the value function of the state $s$;
8.3) Repeat 8.2) until the screened image candidate region is traversed, obtaining the value function and the backtracking function at every state point.
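The accumulation of steps 8.1) to 8.3) is a per-frame dynamic program. The sketch below assumes simple Python containers: frames as 2-D arrays indexed [y, x], candidate regions as lists of (x, y) points, and an explicit predecessor list standing in for the transferable-state set; all of these layouts are assumptions:

```python
import numpy as np

def accumulate_value_function(frames, regions, predecessors):
    """Steps 8.1)-8.3): V_k(s) = i_k(s) + max over transferable predecessor
    states of V_{k-1}, with the arg-max stored as the backtracking function
    B_k(s). predecessors[k][j] lists indices into regions[k-1] that can
    transfer to regions[k][j]; frame 0 has no predecessors."""
    V = [np.array([float(frames[0][y, x]) for x, y in regions[0]])]
    B = [np.full(len(regions[0]), -1)]
    for k in range(1, len(frames)):
        vk, bk = [], []
        for j, (x, y) in enumerate(regions[k]):
            # search: best predecessor state of frame k-1
            best = max(predecessors[k][j], key=lambda p: V[k - 1][p])
            # accumulation: add this frame's pixel value to the best path
            vk.append(float(frames[k][y, x]) + V[k - 1][best])
            bk.append(best)
        V.append(np.array(vk))
        B.append(np.array(bk))
    return V, B
```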
Step nine: judge whether the target exists through the maximum of the value function.
9.1) Set a detection threshold $V_T$;
9.2) Repeat steps five to eight in turn until k = R; record the maximum of the value function of the R-th frame as $V_{max}$ and compare it with the detection threshold $V_T$: if $V_{max} \ge V_T$, execute step ten; otherwise there is no target, and jump to step eleven.
Step ten: obtain the predicted track of the i-th target over all R frames through the backtracking function.
10.1) When the target is detected, the predicted state of the i-th target in the R-th frame is the state point with the largest value function in the R-th frame; the predicted states of the remaining frames are obtained through the backtracking function, i.e. the predicted state of frame k-1 is the state that the backtracking function returns for the predicted state of frame k;
10.2) Repeat step 10.1) until k = 1, obtaining the predicted track of the i-th target over the R frames.
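Steps nine and ten (threshold test on the last-frame maximum, then frame-by-frame backtracking) can be sketched as follows; V[k][j] holding the value function and B[k][j] the best-predecessor index of state j in frame k are assumed container layouts:

```python
def detect_and_backtrack(V, B, regions, v_t):
    """Steps nine/ten: if the largest last-frame value function reaches the
    detection threshold V_T, start from that state and follow the
    backtracking function down to frame 1; otherwise report no target.
    regions[k][j] is the (x, y) position of state j in frame k."""
    k = len(V) - 1
    idx = max(range(len(V[k])), key=lambda j: V[k][j])
    if V[k][idx] < v_t:
        return None                      # initial-detection false alarm
    track = [regions[k][idx]]
    while k > 0:
        idx = int(B[k][idx])             # predecessor in frame k-1
        k -= 1
        track.append(regions[k][idx])
    return track[::-1]                   # predicted track, frame 1 ... frame R
```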
Step eleven: repeat steps four to ten in sequence until all the candidate targets are traversed, obtaining the candidate tracks of all targets.
The effect of the invention can be further illustrated by the following simulation.
1. Simulation parameters: the simulation parameters are shown in Table 1.
Table 1 Simulation parameters (the table is provided as an image in the original document)
2. Simulation content:
The original echo of the video synthetic aperture radar is simulated with the parameters of Table 1; the raw echo is imaged in polar coordinate format to obtain high-resolution video synthetic aperture radar images and the corresponding low-resolution range-Doppler spectra, and the moving target shadows in the high-resolution images are detected with the joint pre-detection tracking method of the invention. The results are shown in FIG. 4, where:
FIG. 4(a) shows the original image, in which there are 5 shadows left by moving targets: target 1 performs a leftward curvilinear motion, target 2 a rightward curvilinear motion, target 3 a linear motion that first accelerates and then decelerates, target 4 a linear motion that first decelerates and then accelerates, and target 5 a uniform linear motion throughout;
FIG. 4(b) shows the moving target shadow detection result of the method of the present invention;
FIG. 4(c) shows the corresponding detection result in the Doppler spectra using the method of the present invention.
As can be seen from FIG. 4(b), the method of the present invention achieves accurate tracking and detection of the shadows of the 5 strongly maneuvering targets, whose positions are marked by boxes, and no false alarm target appears.
As can be seen from FIG. 4(c), the method of the present invention accurately covers the positions of the 5 strongly maneuvering targets in the Doppler spectrum.
The foregoing description is only a specific example of the present invention and is not intended to limit the invention, so that it will be apparent to those skilled in the art that various changes and modifications in form and detail may be made therein without departing from the spirit and scope of the invention.

Claims (8)

1. A joint tracking-before-detection method for moving targets of a video synthetic aperture radar, comprising the following steps:
(1) dividing the original echo signal of the video synthetic aperture radar into R sub-apertures, each containing N pulses, and acquiring the high-resolution video synthetic aperture radar image I = {i_1, i_2, ..., i_j, ..., i_R} of each sub-aperture and the corresponding low-resolution range-Doppler spectrum D = {d_1, d_2, ..., d_j, ..., d_R}, where i_j represents the j-th frame image and d_j the j-th range-Doppler spectrum, j = 1, 2, ..., R;
(2) sequentially carrying out image registration and two-parameter constant false alarm rate detection on the first two frames {i_1, i_2} of the video synthetic aperture radar to obtain the sets P_1 and P_2 of all preliminary detection points in the first two frames;
(3) determining all candidate track heads of the first frame from the sets P_1 and P_2 of preliminary detection points by distance matching, and recording the set of all candidate track heads as C, whose i-th element represents the i-th candidate track head of the 1st frame, i = 1, 2, ..., Q, Q being the total number;
(4) initializing the value function, the backtracking function and the image candidate region of the i-th target in frame 1;
(5) letting the current frame be the k-th frame, k ≥ 2, and performing a state transition on all state points of the image candidate region of the i-th target in frame k-1 to form the image candidate region of the i-th target in the k-th frame;
(6) mapping the image candidate region of the i-th target of the k-th frame to the range-Doppler spectrum, forming a candidate region in the range-Doppler spectrum d_k:
(6a) taking any state point of the image candidate region and obtaining its true state [x', v_x', y', v_y']^T by geometric transformation, scaling and state transition;
(6b) obtaining, from the true state [x', v_x', y', v_y']^T and the radar flight parameters, the point in the range-Doppler spectrum corresponding to the state point;
(6c) repeating (6a) to (6b) until the image candidate region is traversed, forming the corresponding candidate region of the Doppler spectrum d_k;
(7) screening the state points:
(7a) screening the sub candidate regions of the range-Doppler spectrum candidate region to obtain the screened range-Doppler spectrum candidate region, and obtaining from it, according to the correspondence between the image and the Doppler spectrum, the new image candidate region;
(7b) setting a minimum amplitude threshold T_min and deleting all state points of the new image candidate region whose amplitude is smaller than the threshold, forming the final image candidate region; then assigning the final image candidate region to the image candidate region of the i-th target in the k-th frame, achieving the screening of state points in the image;
(8) obtaining, by accumulation and search respectively, the value function and the backtracking function of all state points of the screened image candidate region, where the num-th state of the i-th target in the k-th frame is indexed by num = 1, 2, ..., up to the total number of state points;
(9) repeating (5) to (8) until k = R, and comparing the maximum value function of the R-th frame with the set threshold V_T: if it is smaller than V_T, the track head is considered an initial-detection false alarm and no target exists; otherwise a target is considered present, and the state of each frame is traced back through the backtracking function to obtain the predicted track of the i-th target over all R frames, the j-th element of which is the predicted state of the i-th target in frame j;
(10) repeating (4) to (9) in sequence until all the candidate targets are traversed, obtaining the candidate tracks of all targets and completing the detection of the targets.
2. The method of claim 1, wherein the high-resolution video synthetic aperture radar image and the corresponding low-resolution range-Doppler spectrum of each sub-aperture in (1) are obtained as follows:
1a) letting the complex echo signal of the video synthetic aperture radar be an echo matrix of J range gates and Z pulses, and dividing it into R sub-apertures with an overlapped sliding window of length N and step S, where R = ⌊(Z - N)/S⌋ + 1, ⌊·⌋ denotes the rounding-down operation, and S < N;
1b) imaging the N pulses of each sub-aperture with a polar coordinate format imaging algorithm to obtain a high-resolution video synthetic aperture radar image, extracting W consecutive pulses from the center of each sub-aperture, and sequentially performing range compression and azimuth Fourier transform to obtain the corresponding low-resolution range-Doppler spectrum;
1c) processing the echo data of each sub-aperture as in 1b), finally obtaining R frames of image data I = {i_1, i_2, ..., i_j, ..., i_R} and R frames of range-Doppler data D = {d_1, d_2, ..., d_j, ..., d_R}, where i_j represents the j-th frame image and d_j the j-th range-Doppler spectrum, j = 1, 2, ..., R.
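The sliding-window division of sub-step 1a) can be sketched as follows; the echo is modeled as a J × Z matrix and R = ⌊(Z − N)/S⌋ + 1 overlapped windows are taken along the pulse axis (the variable names are illustrative):

```python
import numpy as np

def split_subapertures(echo, n, s):
    """Claim 2, step 1a): divide a J x Z echo matrix (J range gates,
    Z pulses) into R = floor((Z - n) / s) + 1 overlapped sub-apertures
    of n pulses each, taken with step s (s < n)."""
    z = echo.shape[1]
    r = (z - n) // s + 1                 # number of sub-apertures R
    return [echo[:, k * s: k * s + n] for k in range(r)]
```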
3. The method of claim 1, wherein in (3) all candidate track heads of the first frame are determined from the sets P_1 and P_2 of preliminary detection points by distance matching, implemented as follows:
3a) setting the displacement range of the moving target between two adjacent frames as [0, L_max] and initializing the candidate track head set C, where L_max is the maximum distance of the moving target between two adjacent frames;
3b) calculating the distance L between a point (x_i, y_i) of the first-frame preliminary detection point set P_1 and the point (x_near, y_near) of the second-frame preliminary detection point set P_2 closest to (x_i, y_i):
L = sqrt((x_near - x_i)^2 + (y_near - y_i)^2);
3c) judging whether the distance L is within [0, L_max]: if not, discarding the point (x_i, y_i) as a false alarm point; if so, calculating the velocity (v_xi, v_yi) of the point (x_i, y_i):
v_xi = (x_near - x_i)/t, v_yi = (y_near - y_i)/t,
where t is the time difference between two adjacent frames, v_xi is the lateral velocity and v_yi the longitudinal velocity of the point (x_i, y_i);
3d) adding the initial state [x_i, v_xi, y_i, v_yi]^T of the target to the candidate track head set C, where T denotes transposition;
3e) repeating 3b) to 3d) until all points of P_1 are traversed, obtaining the final candidate track head set C, whose i-th element denotes the initial state of the i-th candidate target of the first frame, i = 1, 2, ..., Q.
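Steps 3a) to 3e) amount to a nearest-neighbour gate on the first two detection sets; a minimal sketch, with function name and container layout as assumptions:

```python
import numpy as np

def candidate_track_heads(p1, p2, l_max, t):
    """Claim 3: match each first-frame detection to its nearest second-frame
    detection; pairs within [0, L_max] seed a candidate track head
    [x, vx, y, vy] with velocity = displacement / t, the rest are
    discarded as false alarms."""
    heads = []
    p2 = np.asarray(p2, float)
    for x, y in np.asarray(p1, float):
        d = np.hypot(p2[:, 0] - x, p2[:, 1] - y)   # distances to all of P_2
        j = int(d.argmin())                        # nearest point of P_2
        if d[j] <= l_max:
            vx = (p2[j, 0] - x) / t                # lateral velocity
            vy = (p2[j, 1] - y) / t                # longitudinal velocity
            heads.append([x, vx, y, vy])
    return heads
```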
4. The method of claim 1, wherein in (5) the image candidate region of the i-th target in the k-th frame is formed by performing a state transition on all state points of the image candidate region of the previous frame, implemented as follows:
5a) the image candidate region of frame k-1 is expressed as a set of state points of a discrete state space, where the num-th state point of the i-th target in the (k-1)-th frame, num = 1, ..., N, has as elements the lateral coordinate, lateral velocity, longitudinal coordinate and longitudinal velocity on the (k-1)-th frame image i_{k-1}, N is the total number of states of the i-th target of the (k-1)-th frame, and T denotes transposition;
5b) using a uniform-velocity transfer model, perform a state transition on any state point s = [x, v_x, y, v_y]^T of the image candidate region:
s' = (I_2 ⊗ [[1, t], [0, 1]]) s + [0, ε_x, 0, ε_y]^T,
ε_x ∈ {ε_x1, ε_x2, ..., ε_xn}, ε_y ∈ {ε_y1, ε_y2, ..., ε_ym},
where I_2 is the second-order identity matrix, ⊗ represents the Kronecker product of matrices, ε_x and ε_y are the acceleration process noises in the x and y directions respectively, the sets {ε_x1, ..., ε_xn} and {ε_y1, ..., ε_ym} are the selectable ranges of ε_x and ε_y, and T denotes transposition;
5c) repeating 5b) until all state points of the image candidate region of frame k-1 are traversed; all the states after the state transition form the image candidate region of the i-th target of the k-th frame, whose state points have as elements the lateral coordinate, lateral velocity, longitudinal coordinate and longitudinal velocity on the k-th frame i_k; the region is the union of its sub candidate regions, the η-th sub candidate region of the i-th target of the k-th frame collecting the states transferred from one state point of frame k-1; T denotes transposition.
5. The method of claim 1, wherein in (6a) any state point of the image candidate region is transferred to obtain the true state [x', v_x', y', v_y']^T, implemented as follows:
6a1) taking any state point from the image candidate region, its coordinates (x_0, y_0) are given by x_0, the abscissa of the state point (the first element of the state), and y_0, its ordinate (the third element of the state);
6a2) through the geometric relationship, the coordinates (x_0, y_0) are transformed into the coordinates (x_1, y_1) in the corresponding scene image by rotating the centered offsets α = x_0 - N/2 and β = y_0 - J/2 by the rotation angle θ of the radar platform relative to the scene at the current moment, where N denotes the number of pulses, J denotes the number of range gates, x_1 is the abscissa and y_1 the ordinate of the state point in the scene image;
6a3) by scaling, the coordinates (x_1, y_1) are converted into the actual position coordinates (x', y') in the stationing coordinate system using the range resolution ρ_r and the azimuth resolution ρ_α of the video synthetic aperture radar system, where x' denotes the actual abscissa and y' the actual ordinate of the state in the stationing coordinate system;
6a4) performing a state transition on the state point to obtain the state of the target in the next frame, converting it into the actual position coordinates in the stationing coordinate system according to 6a1) to 6a3), and obtaining, from the position difference of the two points and the time interval of the two frames, the actual velocity v_x', v_y' of the state point; x', y', v_x', v_y' compose the true state [x', v_x', y', v_y']^T.
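Steps 6a2) and 6a3) can be sketched as a rotate-then-scale; the rotation direction and which resolution scales which axis appear only as images in the source, so both are assumptions here:

```python
import numpy as np

def pixel_to_scene(x0, y0, theta, n, j, rho_a, rho_r):
    """Steps 6a2)-6a3): rotate the centered pixel offsets
    (alpha, beta) = (x0 - N/2, y0 - J/2) by the platform rotation angle
    theta, then scale by the azimuth and range resolutions to metres."""
    alpha, beta = x0 - n / 2.0, y0 - j / 2.0
    x1 = alpha * np.cos(theta) - beta * np.sin(theta)   # assumed rotation sense
    y1 = alpha * np.sin(theta) + beta * np.cos(theta)
    return x1 * rho_a, y1 * rho_r                       # (x', y') in metres
```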
6. The method of claim 1, wherein in (6b) the point in the range-Doppler spectrum corresponding to the state point is obtained from the true state [x', v_x', y', v_y']^T and the radar flight parameters, implemented as follows:
6b1) the radar performs uniform circular motion; letting its position coordinates at the moment of generating image i_k be (x_R, y_R, z_R) and its velocity vector be $\vec{v}_R$, and letting the coordinate vector of the point K in the actual scene corresponding to the target be (x', y', 0) and its velocity vector be $\vec{v}_K$, the normalized slant range vector of the line connecting the target and the radar is

$\vec{r} = \dfrac{(x_R - x',\ y_R - y',\ z_R)}{\left|(x_R - x',\ y_R - y',\ z_R)\right|}$

where |·| represents the modulus of the vector;
6b2) from the slant range vector $\vec{r}$, the radar velocity $\vec{v}_R$ and the target velocity $\vec{v}_K$, the radial velocity $v_{RK}$ of the moving target K relative to the radar is

$v_{RK} = (\vec{v}_R - \vec{v}_K) \cdot \vec{r}$

where $\cdot$ represents the dot product;
6b3) from the radial velocity $v_{RK}$, the Doppler frequency $f_{DK}$ of the moving target K relative to the radar is

$f_{DK} = \dfrac{2 v_{RK}}{\lambda}$

where $\lambda$ represents the wavelength of the radar transmitted signal;
6b4) from the calculation result in 6b3), the abscissa and the ordinate of the point in the Doppler spectrum $d_k$ corresponding to the state point are obtained; the two coordinates are computed from the Doppler frequency $f_{DK}$ and from the third element of the state vector (its longitudinal coordinate), respectively.
7. The method of claim 1, wherein in (7a) the sub candidate regions of the range-Doppler spectrum candidate region are screened, implemented as follows:
7a1) the range-Doppler spectrum candidate region is expressed as the union of its sub candidate regions, indexed by n from 1 up to the total number of sub candidate regions;
7a2) arbitrarily choosing a sub candidate region, sorting its candidate points from large to small according to their amplitude on the Doppler spectrum $d_k$, and taking the first Ns candidate points to form the screened sub candidate region;
7a3) repeating step 7a2) until all the sub candidate regions are traversed, forming the screened range-Doppler spectrum candidate region.
8. The method of claim 1, wherein in (8) the value function and the backtracking function of all state points of the screened image candidate region are obtained by accumulation and search, implemented as follows:
8a) the screened image candidate region is expressed as the set $\{s_k^{i,num} \mid num = 1, \dots, N_k^i\}$ of the discrete state space, where $N_k^i$ is the total number of state points after screening, $s_k^{i,num} = [x,\ v_x,\ y,\ v_y]^T$ denotes the num-th state of the i-th target in the k-th frame, its elements are the lateral coordinate, lateral velocity, longitudinal coordinate and longitudinal velocity on the k-th frame image $i_k$, and T denotes transposition;
8b) taking any state point $s_k^{i,num}$ of the screened image candidate region, its value function $V(s_k^{i,num})$ is obtained by accumulation and its backtracking function $B(s_k^{i,num})$ by search:

$V(s_k^{i,num}) = i_k(x_k^{i,num}, y_k^{i,num}) + \max_{s \in \tau(s_k^{i,num})} V(s)$

$B(s_k^{i,num}) = \arg\max_{s \in \tau(s_k^{i,num})} V(s)$

where $\tau(s_k^{i,num})$ is the set of state points of the image candidate region that can be transferred to the state $s_k^{i,num}$, $i_k(x_k^{i,num}, y_k^{i,num})$ is the pixel value of image $i_k$ at abscissa $x_k^{i,num}$ and ordinate $y_k^{i,num}$, and $V(s)$ denotes the value function of the state $s$;
8c) repeating 8b) until the screened image candidate region is traversed, obtaining the value function and the backtracking function at every state point.
CN202011068438.5A 2020-10-09 2020-10-09 Moving target joint pre-detection tracking method of video synthetic aperture radar Active CN112083418B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011068438.5A CN112083418B (en) 2020-10-09 2020-10-09 Moving target joint pre-detection tracking method of video synthetic aperture radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011068438.5A CN112083418B (en) 2020-10-09 2020-10-09 Moving target joint pre-detection tracking method of video synthetic aperture radar

Publications (2)

Publication Number Publication Date
CN112083418A true CN112083418A (en) 2020-12-15
CN112083418B CN112083418B (en) 2022-05-17

Family

ID=73730600

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011068438.5A Active CN112083418B (en) 2020-10-09 2020-10-09 Moving target joint pre-detection tracking method of video synthetic aperture radar

Country Status (1)

Country Link
CN (1) CN112083418B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114185047A (en) * 2021-12-09 2022-03-15 电子科技大学 Bistatic SAR moving target refocusing method based on optimal polar coordinate transformation
CN114415180A (en) * 2022-03-30 2022-04-29 中国人民解放军火箭军工程大学 Stable tracking method fusing SAR high-resolution image and one-dimensional range profile

Citations (2)

Publication number Priority date Publication date Assignee Title
CN104076355A (en) * 2014-07-04 2014-10-01 西安电子科技大学 Method for conducting before-detection tracking on weak and small target in strong-clutter environment based on dynamic planning
CN110361734A (en) * 2019-08-27 2019-10-22 北京无线电测量研究所 Faint moving target detection method, device, computer equipment and storage medium

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN104076355A (en) * 2014-07-04 2014-10-01 西安电子科技大学 Method for conducting before-detection tracking on weak and small target in strong-clutter environment based on dynamic planning
CN110361734A (en) * 2019-08-27 2019-10-22 北京无线电测量研究所 Faint moving target detection method, device, computer equipment and storage medium

Non-Patent Citations (2)

Title
DING J et al., "Efficient Doppler ambiguity resolver for video SAR", Electronics Letters *
LIANG Jian et al., "Research on space-based video SAR system design and imaging algorithm", Chinese Space Science and Technology *

Cited By (4)

Publication number Priority date Publication date Assignee Title
CN114185047A (en) * 2021-12-09 2022-03-15 电子科技大学 Bistatic SAR moving target refocusing method based on optimal polar coordinate transformation
CN114185047B (en) * 2021-12-09 2023-06-27 电子科技大学 Double-base SAR moving target refocusing method based on optimal polar coordinate transformation
CN114415180A (en) * 2022-03-30 2022-04-29 中国人民解放军火箭军工程大学 Stable tracking method fusing SAR high-resolution image and one-dimensional range profile
CN114415180B (en) * 2022-03-30 2022-07-01 中国人民解放军火箭军工程大学 Stable tracking method fusing SAR high-resolution image and one-dimensional range profile

Also Published As

Publication number Publication date
CN112083418B (en) 2022-05-17

Similar Documents

Publication Publication Date Title
Lim et al. Radar and camera early fusion for vehicle detection in advanced driver assistance systems
CN109459750B (en) Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision
CN104851097B (en) The multichannel SAR GMTI methods aided in based on target shape and shade
CN103064086B (en) Vehicle tracking method based on depth information
CN109375177B (en) Moving target detection method for airport scene surveillance radar system
CN112083418B (en) Moving target joint pre-detection tracking method of video synthetic aperture radar
Wen et al. Video SAR moving target detection using dual faster R-CNN
CN102778680B (en) Method for imaging uniformly accelerated motion rigid group targets based on parameterization
CN111208479B (en) Method for reducing false alarm probability in deep network detection
CN108961255B (en) Sea-land noise scene segmentation method based on phase linearity and power
CN112184749B (en) Moving target tracking method based on video SAR cross-domain combination
CN104569964A (en) Moving target two-dimensional detecting and tracking method for ultra-wideband through-wall radar
CN112991391A (en) Vehicle detection and tracking method based on radar signal and vision fusion
CN114280611A (en) Road side sensing method integrating millimeter wave radar and camera
CN109100697B (en) Target condensation method based on ground monitoring radar system
CN113570632B (en) Small moving target detection method based on high-time-phase space-borne SAR sequential image
Yu et al. Camera-radar data fusion for target detection via Kalman filter and Bayesian estimation
CN106707278B (en) doppler beam sharpening imaging method and device based on sparse representation
CN108983194B (en) Target extraction and condensation method based on ground monitoring radar system
Jibrin et al. An object detection and classification method using radar and camera data fusion
CN110490903A (en) Multiple target fast Acquisition and tracking in a kind of Binocular vision photogrammetry
CN109917383A (en) Low signal-to-noise ratio ISAR imaging method based on echo down-sampling accumulation
CN108828549B (en) Target extraction method based on airport scene surveillance radar system
CN104537690B (en) One kind is based on the united moving spot targets detection method of maximum time index
Aguilar et al. Small moving target MOT tracking with GM-PHD filter and attention-based CNN

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant