CN112083418A - Moving target joint pre-detection tracking method of video synthetic aperture radar - Google Patents
- Publication number
- CN112083418A (application number CN202011068438.5A)
- Authority
- CN
- China
- Prior art keywords
- state
- image
- candidate
- frame
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/9004—SAR image acquisition techniques
- G01S13/9005—SAR image acquisition techniques with optical processing of the SAR signals
Abstract
The invention discloses a joint track-before-detect method for moving targets in video synthetic aperture radar, which mainly addresses the heavy computation and poor moving-target detection performance of conventional track-before-detect methods. The scheme is as follows: continuous multi-frame radar observation data are divided into image data and range-Doppler data; preliminary detection is performed on the first two frames of the image data, and track heads are initialized to form candidate regions; all state points in the candidate region of the previous frame undergo state transition to yield the candidate region of the target in the next frame; the candidate region is first screened jointly using the image and the range-Doppler data, and then the value function and the backtracking function are computed; after all frames have been processed, targets are declared by comparing the value function against a threshold, and the predicted track of each target is recovered through the backtracking function. On the premise of guaranteeing a low false-alarm rate and a high detection rate for maneuvering targets, the method effectively reduces the computational load of track-before-detect, and can be used by radar systems for efficient detection of maneuvering-target shadows.
Description
Technical Field
The invention belongs to the technical field of radar signal processing, and particularly relates to a joint moving-target detection method that can be used by a video synthetic aperture radar for shadow-based detection of moving targets.
Background
Video synthetic aperture radar (ViSAR) is a high-frame-rate imaging system operating in a high frequency band. It usually adopts a spotlight imaging mode and obtains a high-frame-rate image sequence through overlapping-aperture processing, enabling real-time observation of the imaged area; a range-Doppler spectrum is obtained by processing the echoes of each frame. Within the synthetic aperture time, partly because the target occludes the ground area and partly because of the Doppler shift produced by the relative motion between target and radar, a moving target leaves a shadow at its actual position. Based on this characteristic, detecting moving targets via their dynamic shadows in video synthetic aperture radar images has become a new path for SAR moving-target detection. The usual approach is to remove background clutter by image processing and then apply a constant-false-alarm-rate (CFAR) threshold test to each frame of the data to be detected. This procedure belongs to the detect-before-track paradigm, and its detection performance is severely limited in low signal-to-noise-ratio environments.
Track-before-detect (TBD) is a newer detection technique aimed at weak targets. Compared with the traditional detect-then-track approach, it accumulates energy over multiple consecutive frames and processes them jointly, outputting detection and tracking results simultaneously, which markedly improves radar detection and tracking performance on weak targets. TBD is therefore well suited to the moving-target shadow detection task of video synthetic aperture radar, and TBD based on the dynamic programming (DP) strategy offers good performance and a wide application range. However, current research on ViSAR moving-target shadow detection based on TBD remains incomplete: only image data are used, and the velocity information in the range-Doppler spectrum is discarded, which wastes resources and degrades performance on highly maneuvering targets; moreover, the computational load of TBD grows rapidly with the number of frames, making the complexity too high. A new TBD technique is therefore urgently needed for efficient detection of ViSAR moving-target shadows.
Zhang et al., in the article "A novel approach to moving targets shadow detection in image sequence", propose a classic shadow detection method based on image processing, comprising image registration, speckle noise suppression, background extraction, difference processing, morphological processing, and connected-component detection. The background is extracted from multi-frame information of the video synthetic aperture radar images, and shadow targets are then detected by morphological processing on the background-removed images. The method is computationally intensive, and its performance degrades severely as noise increases.
Tian et al., in the paper "Simultaneous Detection and Tracking of Moving-Target Shadows in ViSAR Imagery", propose a moving-target shadow detection method based on track-before-detect. It adopts an expand-then-contract strategy: the states of each frame are expanded with a Gaussian distribution, and after state transition the states are contracted on the basis of a pixel-value likelihood ratio, retaining a small number of high-weight particles; after the state transition of the last frame, the track with the largest value function is selected as the target track, completing per-frame detection while outputting the track. This method improves the distribution of the state candidate regions over the traditional TBD algorithm so that they better cover the target, and improves detection performance through multi-frame accumulation. However, it has two drawbacks: the computational load expands rapidly with the number of frames, and, because only image data are used without the range-Doppler spectrum, detection performance on strongly maneuvering targets is poor.
Disclosure of Invention
The invention aims to provide a joint track-before-detect method for moving targets in video synthetic aperture radar that overcomes the defects of the prior art, so as to reduce the computational load, lower the false-alarm rate for moving targets, and improve detection accuracy.
The technical scheme of the invention is as follows: on the basis of a track-before-detect method using a dynamic programming strategy, the initial states of candidate targets are extracted automatically through preprocessing, the information in the image and in the range-Doppler spectrum is exploited jointly, and shadow detection of moving targets in video synthetic aperture radar is carried out. The implementation steps are:
A track-before-detect method for joint moving-target detection in video synthetic aperture radar, characterized by comprising the following steps:
(1) dividing the original echo signal of the video synthetic aperture radar into R sub-apertures, each containing N pulses, and acquiring for each sub-aperture a high-resolution video synthetic aperture radar image, giving I = {i_1, i_2, ..., i_j, ..., i_R}, and the corresponding low-resolution range-Doppler spectra D = {d_1, d_2, ..., d_j, ..., d_R}, where i_j denotes the j-th frame image, d_j the j-th range-Doppler spectrum, j = 1, 2, ..., R;
(2) performing image registration and two-parameter constant-false-alarm-rate detection in sequence on the first two frames {i_1, i_2} to obtain the sets P_1 and P_2 of all preliminary detection points in the first two frames;
(3) determining all candidate track heads of the first frame from the sets of preliminary detection points P_1 and P_2 by distance matching, and recording the set of all candidate track heads as C = {s_1^i, i = 1, 2, ..., M}, where s_1^i denotes the i-th candidate track head of frame 1 and M is the total number of candidate track heads;
(4) for the i-th target, initializing the frame-1 value function V_1^i, backtracking function B_1^i, and image candidate region Ψ_1^i;
(5) letting the current frame be the k-th frame, k ≥ 2, and the image candidate region of the i-th target in frame k-1 be Ψ_{k-1}^i; performing state transition on all state points in the candidate region of the previous frame to form the image candidate region Ψ_k^i of the i-th target in frame k;
(6) mapping the image candidate region Ψ_k^i of the i-th target of frame k into the range-Doppler spectrum, forming the candidate region Φ_k^i in the range-Doppler spectrum d_k:
(6a) taking any state point s_k^{i,num} of the image candidate region Ψ_k^i and obtaining its true state [x' v_x' y' v_y']^T through geometric transformation, scaling, and state transition;
(6b) obtaining, from the true state [x' v_x' y' v_y']^T and the radar flight parameters, the point in the range-Doppler spectrum corresponding to the state point;
(6c) repeating (6a) to (6b) until the image candidate region Ψ_k^i has been traversed, forming the candidate region Φ_k^i of the corresponding range-Doppler spectrum d_k;
(7) screening the state points:
(7a) screening the sub-candidate regions of the range-Doppler candidate region Φ_k^i to obtain the screened range-Doppler candidate region, and, according to the correspondence between image and Doppler spectrum, obtaining from it a new image candidate region;
(7b) setting a minimum amplitude threshold T_min, deleting from the new image candidate region all state points whose amplitude is smaller than the threshold to form the final image candidate region, and assigning it to the image candidate region Ψ_k^i of the i-th target in frame k, completing the screening of state points in the image;
(8) obtaining, by accumulation and by search respectively, the value function V_k^i(num) and the backtracking function B_k^i(num) of every state point s_k^{i,num} in the screened region Ψ_k^i, where s_k^{i,num} denotes the num-th state of the i-th target in frame k and num runs over the total number of state points in Ψ_k^i;
(9) repeating (5) to (8) until k = R, and comparing the maximum value function of the R-th frame with a set threshold V_T: if it is below V_T, the preliminary detection is declared a false alarm and no target is found; otherwise a target is declared, and the state of each frame is traced back through the backtracking function to obtain the predicted track of the i-th target over all R frames, {ŝ_1^i, ..., ŝ_R^i}, where ŝ_j^i denotes the frame-j predicted state of the i-th target;
(10) repeating (4) to (9) in sequence until all candidate targets have been traversed, obtaining the candidate tracks of all targets and completing target detection.
Compared with the prior art, the invention has the following advantages:
1) By applying a track-before-detect algorithm to moving-target shadow detection in video SAR, the invention accumulates the energy of multiple consecutive frames, greatly improving weak-target detection performance and escaping the limitations of traditional single-frame detection.
2) By performing image registration, two-parameter constant-false-alarm-rate detection, and distance matching on the first two frames in sequence, the initial states of candidate targets are extracted automatically, which lowers the preliminary-detection false-alarm rate, raises detection efficiency, and makes the whole tracking process more automatic and intelligent.
3) Because the position information of the target in the image is combined with its velocity information in the range-Doppler spectrum, candidate points with mismatched velocities or incorrect positions are deleted, greatly reducing the running time of the algorithm, while more state points conforming to the target's motion law are retained, improving shadow detection and tracking performance for maneuvering targets.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 is a usage scenario diagram of the present invention;
FIG. 3 is a diagram of the actual imaging geometry in the present invention;
fig. 4 is a diagram of simulation results of moving target detection in video synthetic aperture radar data using the present invention.
Detailed Description
Embodiments and effects of the present invention will be further described below with reference to the accompanying drawings.
Referring to fig. 1, the implementation steps of the present invention are as follows:
step one, obtaining a high-resolution video synthetic aperture radar image and a corresponding low-resolution range-Doppler spectrum.
1.1) Let the complex echo signal of the video synthetic aperture radar be an echo matrix of J range gates and Z pulses. Divide it into R sub-apertures using an overlapping sliding window of length N and step S, with S < N and R = ⌊(Z − N)/S⌋ + 1, where ⌊·⌋ denotes the rounding-down operation;
1.2) Image the N pulses of each sub-aperture with a polar-format imaging algorithm to obtain a high-resolution video synthetic aperture radar image; extract W consecutive pulses from the center of each sub-aperture and apply range compression followed by an azimuth Fourier transform in sequence to obtain the corresponding low-resolution range-Doppler spectrum;
1.3) Process the echo data of every sub-aperture as in 1.2), finally obtaining R frames of image data I = {i_1, i_2, ..., i_j, ..., i_R} and R frames of range-Doppler data D = {d_1, d_2, ..., d_j, ..., d_R}, where i_j denotes the j-th frame image, d_j the j-th range-Doppler spectrum, j = 1, 2, ..., R.
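The overlapping-window split of 1.1) can be sketched as follows; the sub-aperture count R = ⌊(Z − N)/S⌋ + 1 is a standard reading of the rounding-down formula, and the variable names are illustrative.

```python
import numpy as np

def split_subapertures(echo, N, S):
    """Split a (range_gates x pulses) echo matrix into overlapping
    sub-apertures of N pulses with step S (S < N)."""
    J, Z = echo.shape
    R = (Z - N) // S + 1          # floor((Z - N) / S) + 1 sub-apertures
    return [echo[:, r * S : r * S + N] for r in range(R)]

# toy example: 4 range gates, 20 pulses, N = 8, S = 4 -> 4 sub-apertures
echo = np.arange(80).reshape(4, 20)
subs = split_subapertures(echo, N=8, S=4)
```

Each returned block would then be imaged independently (polar format for the image, range compression plus azimuth FFT of its central W pulses for the Doppler spectrum).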
Step two: obtain from the first two frames {i_1, i_2} the sets P_1 and P_2 of all preliminary detection points.
2.1) Take the first two frames {i_1, i_2}, use the adjacent frame of each as its reference image, register each frame to its reference, and subtract the registered image from the reference to extract moving targets, obtaining two preprocessed frames {i_1', i_2'};
2.2) Set the preliminary-detection false-alarm rate to f_a and compute from it the threshold T_h of the two-parameter constant-false-alarm-rate detector;
2.3) In the preprocessed first frame i_1', let the point under test be p = (x, y), x = 1, ..., N_x, y = 1, ..., N_y, where N_x and N_y are the two dimensions of i_1'. Let the detection window of p be Θ, and compute the mean μ and standard deviation σ within Θ:
μ = (1/|Θ|) Σ_{p_i ∈ Θ} i_1'(p_i),  σ = sqrt( (1/|Θ|) Σ_{p_i ∈ Θ} (i_1'(p_i) − μ)² ),
where p_i is a point in the detection window Θ, |Θ| is the total number of points in the window, and i_1'(p_i) is the pixel value of p_i in the preprocessed first frame i_1';
2.4) From the results of 2.3), compute the detection statistic of the point under test: T = (i_1'(p) − μ)/σ;
2.5) Compare the statistic T with the threshold T_h to judge whether a target is present at p: if T ≥ T_h, a target is declared at the detection point; otherwise it is not;
2.6) Traverse all detection points of the preprocessed first frame i_1' to obtain its constant-false-alarm detection result; after applying a morphological closing, an opening, and connected-component processing in sequence, extract the centroid coordinates of each connected component to obtain the set of first-frame preliminary detection points P_1 = {(x_i, y_i), i = 1, ..., U_1}, where U_1 is the total number of points in P_1;
2.7) Repeat 2.2) to 2.6) on the second frame i_2' to obtain the set of second-frame preliminary detection points P_2 = {(x_j, y_j), j = 1, ..., U_2}, where U_2 is the total number of points in P_2.
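A minimal sketch of the two-parameter CFAR test of 2.3)-2.5): each pixel is normalised by the mean and standard deviation of a local window (with a guard area around the cell under test) and thresholded. The window sizes and the synthetic test image are illustrative assumptions.

```python
import numpy as np

def two_param_cfar(img, Th, guard=2, train=6):
    """Two-parameter CFAR: T = (pixel - mu) / sigma over a local
    training window excluding a guard area; declare a hit if T >= Th."""
    H, W = img.shape
    hits = np.zeros((H, W), dtype=bool)
    r = guard + train
    for y in range(r, H - r):
        for x in range(r, W - r):
            block = img[y - r:y + r + 1, x - r:x + r + 1].astype(float)
            # mask out the guard area (and the cell under test) at the centre
            mask = np.ones_like(block, dtype=bool)
            mask[train:train + 2 * guard + 1, train:train + 2 * guard + 1] = False
            mu, sigma = block[mask].mean(), block[mask].std()
            if sigma > 0 and (img[y, x] - mu) / sigma >= Th:
                hits[y, x] = True
    return hits

# synthetic check: one bright point in a Gaussian background
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (21, 21))
img[10, 10] = 50.0
hits = two_param_cfar(img, Th=10.0)
```

The morphological closing/opening and connected-component centroid extraction of 2.6) would then run on the `hits` mask (e.g. with `scipy.ndimage`).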
Step three: determine the set C of all candidate track heads of the first frame from the first-frame preliminary detection points P_1 and the second-frame preliminary detection points P_2 by distance matching.
3.1) Set the admissible displacement range of a moving target between two adjacent frames to [0, L_max] and initialize the candidate track-head set C = ∅, where L_max is the maximum displacement of a moving target between two adjacent frames;
3.2) For a point (x_i, y_i) in the first-frame set P_1, find the point (x_near, y_near) in the second-frame set P_2 closest to (x_i, y_i) and compute their distance L = sqrt((x_near − x_i)² + (y_near − y_i)²);
3.3) Judge whether L lies within [0, L_max]: if not, discard (x_i, y_i) as a false-alarm point; if so, compute the velocity (v_xi, v_yi) of the point as v_xi = (x_near − x_i)/t, v_yi = (y_near − y_i)/t, where t is the time difference between two adjacent frames, v_xi is the lateral velocity and v_yi the longitudinal velocity of (x_i, y_i);
3.4) Add the initial target state [x_i v_xi y_i v_yi]^T to the candidate track-head set C, where T denotes transposition;
3.5) Repeat 3.2) to 3.4) until all points in P_1 have been traversed, obtaining the final candidate track-head set C = {s_1^i, i = 1, 2, ..., M}, where s_1^i denotes the initial state of the i-th candidate target in the first frame.
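Steps 3.1)-3.5) amount to a gated nearest-neighbour match; a sketch under illustrative coordinates (pixel units, frame interval t):

```python
import numpy as np

def init_track_heads(P1, P2, Lmax, t):
    """Pair each first-frame detection with its nearest second-frame
    detection; keep pairs closer than Lmax and derive an initial
    state [x, vx, y, vy] from the displacement over the frame interval t."""
    heads = []
    for (xi, yi) in P1:
        d = [np.hypot(xj - xi, yj - yi) for (xj, yj) in P2]
        L = min(d)
        if L <= Lmax:                      # otherwise: discard as false alarm
            xn, yn = P2[int(np.argmin(d))]
            heads.append([xi, (xn - xi) / t, yi, (yn - yi) / t])
    return heads

P1 = [(10.0, 10.0), (50.0, 50.0)]
P2 = [(12.0, 11.0), (90.0, 90.0)]
heads = init_track_heads(P1, P2, Lmax=5.0, t=0.5)   # second point is gated out
```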
Step four: initialize the frame-1 value function V_1^i, backtracking function B_1^i, and image candidate region Ψ_1^i of the i-th target.
4.1) Let the initial state of the i-th candidate target in the first frame be s_1^i = [x_1^i v_x1^i y_1^i v_y1^i]^T, where x_1^i, v_x1^i, y_1^i, v_y1^i are respectively the abscissa, lateral velocity, ordinate, and longitudinal velocity of the initial state (the n-th element of the state, n = 1, 2, 3, 4, corresponds to these components in turn). The value function is then initialized to the pixel value of the point (x_1^i, y_1^i) in the first frame image i_1, i.e. V_1^i = i_1(x_1^i, y_1^i); the backtracking function B_1^i is initialized as empty; and the image candidate region is initialized as Ψ_1^i = {s_1^i}.
Step five: perform state transition on all state points of the image candidate region Ψ_{k-1}^i in the previous frame to form the image candidate region Ψ_k^i of the i-th target in frame k.
5.1) Write the previous-frame candidate region as the discrete state space Ψ_{k-1}^i = {s_{k-1}^{i,num}, num = 1, ..., N_{k-1}^i}, where s_{k-1}^{i,num} = [x v_x y v_y]^T denotes the num-th state of the i-th target in frame k−1, N_{k-1}^i is the total number of states of the i-th target in frame k−1, x, v_x, y, v_y are respectively the lateral coordinate, lateral velocity, longitudinal coordinate, and longitudinal velocity of the state on frame image i_{k-1}, and T denotes transposition;
5.2) Using a constant-velocity transition model, transition any state point s_{k-1}^{i,num} of the candidate region Ψ_{k-1}^i to obtain the num-th state of the i-th target in frame k:
s_k^{i,num} = F s_{k-1}^{i,num} + G [ε_x ε_y]^T, with F = I_2 ⊗ [[1, t], [0, 1]], G = I_2 ⊗ [[t²/2], [t]],
ε_x ∈ {ε_x1, ε_x2, ..., ε_xn}, ε_y ∈ {ε_y1, ε_y2, ..., ε_ym},
where I_2 is the second-order identity matrix, ⊗ denotes the Kronecker product of matrices, ε_x and ε_y are the acceleration process noises in the x- and y-directions, the set {ε_x1, ε_x2, ..., ε_xn} is the selectable range of ε_x, the set {ε_y1, ε_y2, ..., ε_ym} is the selectable range of ε_y, n and m are the total numbers of elements of the two sets, and T denotes transposition.
5.3) Take any element of the x-direction set as the value of ε_x and any element of the y-direction set as the value of ε_y, until all combinations of the acceleration process noises have been traversed; the state point s_{k-1}^{i,num} thus generates n × m different state transition points, which form the num-th sub-candidate region of the i-th target in frame k;
5.4) Repeat 5.2) to 5.3) until all state points of Ψ_{k-1}^i have been traversed; all transitioned states form the image candidate region Ψ_k^i = {φ_k^{i,η}, η = 1, ..., N_{k-1}^i} of the i-th target of frame k, where φ_k^{i,η} denotes the η-th sub-candidate region and the state components x, v_x, y, v_y are the lateral coordinate, lateral velocity, longitudinal coordinate, and longitudinal velocity on frame image i_k.
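The constant-velocity transition of 5.2)-5.3) with an enumerated acceleration-noise grid can be sketched as follows; the transition and noise-input matrices F = I_2 ⊗ [[1, t],[0, 1]] and G = I_2 ⊗ [[t²/2],[t]] are a standard reading of the lost formulas, so treat them as an assumption.

```python
import numpy as np

def transition_points(state, t, ax_set, ay_set):
    """Propagate one state [x, vx, y, vy] with a constant-velocity model
    and enumerate every (ax, ay) acceleration-noise pair, yielding
    n*m candidate states for the next frame."""
    F = np.kron(np.eye(2), np.array([[1.0, t], [0.0, 1.0]]))   # block transition matrix
    G = np.kron(np.eye(2), np.array([[t**2 / 2.0], [t]]))      # noise input matrix (4x2)
    s = np.asarray(state, dtype=float)
    return [F @ s + G @ np.array([ax, ay]) for ax in ax_set for ay in ay_set]

# one state, three x-accelerations, one y-acceleration -> 3 transition points
pts = transition_points([0.0, 1.0, 0.0, 2.0], t=1.0, ax_set=[-1, 0, 1], ay_set=[0])
```

Enumerating the noise grid (rather than sampling it) is what makes the candidate region a finite discrete state space suitable for the dynamic-programming recursion.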
Step six: map the candidate region Ψ_k^i in the image into the range-Doppler spectrum, forming the candidate region Φ_k^i in the range-Doppler spectrum.
6.1) Take any state point of the image candidate region Ψ_k^i and obtain its true state [x' v_x' y' v_y']^T through geometric transformation, scaling, and state transition:
Referring to fig. 2, the area ABCD represents an image i_k and the area EFGH represents the scene image. This step is implemented as follows:
6.1.1) take any state point of the candidate region; its position corresponds to point P in fig. 2, with abscissa x_0 and ordinate y_0, where x_0 is the first element and y_0 the third element of the state;
6.1.2) transform the coordinates (x_0, y_0) of point P into the coordinates (x_1, y_1) in the corresponding scene image through the geometric relationship: with α = x_0 − N/2 and β = y_0 − J/2, rotate (α, β) by the angle θ about the image center and shift back, i.e. x_1 = α cos θ − β sin θ + N/2, y_1 = α sin θ + β cos θ + J/2, where θ is the rotation angle of the radar platform relative to the scene at that moment, N is the number of pulses, J is the number of range gates, and (x_1, y_1) are the coordinates of the state point in the scene image;
6.1.3) convert (x_1, y_1) by scaling into the actual position coordinates (x', y') in the stationing coordinate system, using the range resolution ρ_r and azimuth resolution ρ_α of the video synthetic aperture radar system, where x' and y' are the actual abscissa and ordinate of the state in the stationing coordinate system;
6.1.4) transition the state point to obtain the target state in the next frame, convert it to actual position coordinates in the stationing coordinate system as in 6.1.1) to 6.1.3), obtain from the position difference of the two points and the time interval of the two frames the actual velocity v_x', v_y' of the state point, and compose the true state [x' v_x' y' v_y']^T from x', y', v_x', v_y'.
6.2) From the true state [x' v_x' y' v_y']^T and the radar flight parameters, obtain the point in the range-Doppler spectrum corresponding to the state point.
Referring to fig. 3, this step is implemented as follows:
6.2.1) the radar moves uniformly along a circular path from point A to point B. Let the radar position at the time image i_k is generated be (x_R, y_R, z_R) with velocity vector v_R; let the coordinate vector of the point K in the actual scene corresponding to the target be (x', y', 0) with velocity vector v_K. Compute the normalized slant-range vector of the target-radar line of sight: r = ((x', y', 0) − (x_R, y_R, z_R)) / |(x', y', 0) − (x_R, y_R, z_R)|, where |·| denotes the modulus of a vector;
6.2.2) from the slant-range vector r and the velocities v_R and v_K, compute the radial velocity of the moving target K relative to the radar: v_RK = (v_K − v_R) · r, where · denotes the dot product;
6.2.3) from the radial velocity v_RK, compute the Doppler frequency of the moving target K relative to the radar: f_DK = 2 v_RK / λ, where λ is the wavelength of the radar-transmitted signal;
6.2.4) from the result of 6.2.3), obtain the abscissa and ordinate of the point in the Doppler spectrum d_k corresponding to the state point, by quantizing the target's range and its Doppler frequency f_DK to the range-gate and Doppler-cell indices of the spectrum;
6.3) Repeat 6.1.1) to 6.2.4) until the candidate region Ψ_k^i has been traversed; all the state points then form the candidate region Φ_k^i of the corresponding range-Doppler spectrum d_k, a discrete Doppler space whose points carry the lateral and longitudinal coordinates on the Doppler spectrum, the total number of its states matching that of the candidate region.
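Steps 6.2.1)-6.2.3) reduce to projecting the relative velocity onto the normalized line of sight and scaling by 2/λ; a sketch with illustrative geometry (the sign convention for "radial" velocity is an assumption):

```python
import numpy as np

def doppler_frequency(radar_pos, radar_vel, tgt_pos, tgt_vel, wavelength):
    """Radial velocity of a ground target relative to the radar and the
    resulting Doppler shift f_D = 2 * v_r / lambda."""
    r = np.asarray(tgt_pos, float) - np.asarray(radar_pos, float)
    r_hat = r / np.linalg.norm(r)                      # unit line-of-sight vector
    v_rel = np.asarray(tgt_vel, float) - np.asarray(radar_vel, float)
    v_r = float(np.dot(v_rel, r_hat))                  # radial velocity component
    return v_r, 2.0 * v_r / wavelength

# illustrative values: radar hovering 1 km above a target climbing at 10 m/s
v_r, f_D = doppler_frequency(
    radar_pos=(0.0, 0.0, 1000.0), radar_vel=(0.0, 0.0, 0.0),
    tgt_pos=(0.0, 0.0, 0.0), tgt_vel=(0.0, 0.0, 10.0), wavelength=0.03)
```

The resulting f_D would then be quantized to a Doppler-cell index of d_k as in 6.2.4).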
Step seven: screen the candidate points.
7.1) Screen the sub-candidate regions of the range-Doppler candidate region Φ_k^i to obtain the screened range-Doppler candidate region and the corresponding new image candidate region:
7.1.1) according to the correspondence established in step six, each sub-candidate region of the image candidate region Ψ_k^i has a corresponding sub-candidate region in the Doppler candidate region, which can be written Φ_k^i = {φ_d^{i,η}, η = 1, ..., N_sub}, where φ_d^{i,η} is the η-th sub-candidate region of the range-Doppler candidate region and N_sub is the total number of sub-candidate regions;
7.1.2) take any sub-candidate region, sort its candidate points from large to small according to their amplitude on the Doppler spectrum d_k, and keep the first Ns candidate points to form the screened sub-candidate region, Ns ≥ 16;
7.1.3) repeat 7.1.2) until all sub-candidate regions have been traversed, forming the screened range-Doppler candidate region;
7.1.4) from the correspondence between image and Doppler spectrum, obtain the new image candidate region corresponding to the screened range-Doppler candidate region, completing the screening of candidate points in the Doppler spectrum;
7.2) Screen the candidate points of the new image candidate region: set the amplitude threshold T_min and delete all candidate points whose pixel values are smaller than T_min, obtaining the image candidate region after screening.
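The two-stage screening of step seven — keep the Ns largest-amplitude points in the Doppler spectrum, then discard image points below T_min — can be sketched as follows (the names and toy amplitudes are illustrative):

```python
import numpy as np

def screen_candidates(points, dop_amp, img_amp, Ns, Tmin):
    """Keep the Ns points with the largest range-Doppler amplitude, then
    drop any whose image amplitude falls below Tmin. `points` holds state
    identifiers; `dop_amp` and `img_amp` are parallel amplitude arrays."""
    dop_amp = np.asarray(dop_amp, float)
    order = np.argsort(-dop_amp)[:Ns]              # top-Ns by Doppler amplitude
    return [points[i] for i in order if img_amp[i] >= Tmin]

pts = ['a', 'b', 'c', 'd']
kept = screen_candidates(pts, dop_amp=[0.9, 0.1, 0.7, 0.5],
                         img_amp=[5.0, 5.0, 1.0, 5.0], Ns=3, Tmin=2.0)
```

Discarding velocity-mismatched points in the Doppler domain before the image-amplitude test is what shrinks the state space and hence the running time of the recursion.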
Step eight: compute the value function and backtracking function of all state points in the screened image candidate region.
8.1) Write the screened region as the discrete state space {s_k^{i,num}, num = 1, ..., N_k^i}, where N_k^i is the total number of state points after screening, s_k^{i,num} = [x v_x y v_y]^T denotes the num-th state of the i-th target in frame k, x, v_x, y, v_y are respectively the lateral coordinate, lateral velocity, longitudinal coordinate, and longitudinal velocity of the state point on the frame image i_k, and T denotes transposition;
8.2) Take any state point s_k^{i,num} of the screened region; obtain its value function by accumulation and its backtracking function by search:
V_k^i(num) = i_k(x, y) + max_{s ∈ Γ} V_{k-1}^i(s),  B_k^i(num) = argmax_{s ∈ Γ} V_{k-1}^i(s),
where Γ is the set of state points of the previous-frame candidate region that can transition to s_k^{i,num}, i_k(x, y) is the pixel value of image i_k at abscissa x and ordinate y, and V_{k-1}^i(s) is the value function of state s;
8.3) Repeat 8.2) until the screened image candidate region has been traversed, obtaining the value function and backtracking function of every state point.
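Steps 8.1)-8.3), together with the backtracking of step ten, follow the classic dynamic-programming TBD recursion. A deliberately simplified 1-D sketch (the actual states are 4-D [x, vx, y, vy]; the toy frames and reachability map are illustrative):

```python
import numpy as np

def dp_tbd(frames, neighbors):
    """Minimal DP-TBD over a 1-D state grid: the value function accumulates
    the best reachable predecessor's value plus the current amplitude; the
    backtracking table records that predecessor, then the best final state
    is traced back to recover the track."""
    K, M = len(frames), len(frames[0])
    V = np.zeros((K, M))
    B = np.zeros((K, M), dtype=int)
    V[0] = frames[0]
    for k in range(1, K):
        for s in range(M):
            prev = neighbors[s]                       # states that can transfer into s
            best = max(prev, key=lambda p: V[k - 1][p])
            B[k][s] = best
            V[k][s] = V[k - 1][best] + frames[k][s]
    s = int(np.argmax(V[-1]))                         # backtrack from best final state
    track = [s]
    for k in range(K - 1, 0, -1):
        s = int(B[k][s])
        track.append(s)
    return V, track[::-1]

# a target drifting one cell per frame: amplitudes 1 along states 0 -> 1 -> 2
frames = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
neighbors = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}
V, track = dp_tbd(frames, neighbors)
```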
Step nine: judge whether a target exists through the maximum of the value function.
9.1) Set a detection threshold V_T;
9.2) Repeat steps five to eight in turn until k = R; take the maximum of the R-th-frame value function, denoted V_max = max_num V_R^i(num), and compare it with the detection threshold V_T: if V_max ≥ V_T, execute step ten; otherwise there is no target, and the procedure jumps to step eleven.
Step ten: obtain the predicted track of the i-th target over all R frames through the backtracking function.
10.1) When a target is detected, the frame-R predicted state of the i-th target is the state point with the largest value function in frame R, and the predicted states of the remaining frames are obtained through the backtracking function: ŝ_{k-1}^i = B_k^i(ŝ_k^i), k = R, R−1, ..., 2;
10.2) Repeat 10.1) until k = 1, obtaining the predicted track {ŝ_1^i, ..., ŝ_R^i} of the i-th target over the R frames.
Step eleven: repeat steps four to ten in sequence until all candidate targets have been traversed, obtaining the candidate tracks of all targets.
The effect of the invention can be further illustrated by the following simulation.
1. Simulation parameters: as shown in Table 1.
TABLE 1 Simulation parameters
2. Simulation content:
Using the simulation parameters of Table 1, the original echo of the video synthetic aperture radar is simulated and imaged in polar format to obtain high-resolution video synthetic aperture radar images and the corresponding low-resolution range-Doppler spectra; the moving-target shadows in the high-resolution images are then detected with the joint track-before-detect method provided by the invention. The results are shown in fig. 4, where:
fig. 4(a) shows an original image, in which 5 moving target shadows left by moving targets exist, that is, a target 1 shows a leftward curvilinear motion, a target 2 shows a rightward curvilinear motion, a target 3 shows a linear motion that is accelerated and then decelerated, a target 4 shows a linear motion that is decelerated and then accelerated, and a target 5 shows a uniform linear motion throughout the entire range;
FIG. 4(b) shows the moving-target shadow detection result obtained with the method of the present invention;
FIG. 4(c) shows the corresponding detection result in the range-Doppler spectra obtained with the method of the present invention.
As can be seen from FIG. 4(b), the method of the present invention achieves accurate tracking and detection of the shadows of the 5 strongly maneuvering targets, whose positions are marked by boxes, and no false-alarm target appears.
As can be seen from FIG. 4(c), the method of the present invention accurately covers the positions of the 5 strongly maneuvering targets in the Doppler spectra.
The foregoing description is only a specific example of the present invention and is not intended to limit it; it will be apparent to those skilled in the art that various changes and modifications in form and detail may be made without departing from the spirit and scope of the invention.
Claims (8)
1. A moving target joint pre-detection tracking method of a video synthetic aperture radar, characterized by comprising the following steps:
(1) dividing the original echo signal of the video synthetic aperture radar into R sub-apertures, each sub-aperture comprising N pulses, and acquiring the high-resolution video synthetic aperture radar image I = {i1, i2, ..., ij, ..., iR} of each sub-aperture and the corresponding low-resolution range-Doppler spectrum D = {d1, d2, ..., dj, ..., dR}, wherein ij represents the j-th frame image, dj represents the j-th range-Doppler spectrum, j = 1, 2, ..., R;
(2) sequentially performing image registration and two-parameter constant false alarm rate detection on the first two frames {i1, i2} of the video synthetic aperture radar, obtaining the sets P1 and P2 of all preliminary detection points in the first two frames;
(3) determining all candidate track heads of the first frame from the preliminary detection point sets P1 and P2 by distance matching, and recording the set of all candidate track heads as C, wherein the ith element of C represents the ith candidate track head of the 1st frame, i = 1, 2, ..., up to the total number of candidate track heads;
(4) initializing the value function, the backtracking function and the image candidate region of the ith target in the 1st frame;
(5) letting the current frame be the kth frame, k ≥ 2, and performing state transition on all state points in the image candidate region of the ith target in the (k-1)th frame to form the image candidate region of the ith target in the kth frame;
(6) mapping the image candidate region of the ith target of the kth frame to the range-Doppler spectrum, forming a candidate region in the range-Doppler spectrum dk:
(6a) taking any state point in the image candidate region and obtaining its true state [x' vx' y' vy']T through geometric transformation, scaling and state transition;
(6b) according to the true state [x' vx' y' vy']T and the radar flight parameters, obtaining the point in the range-Doppler spectrum corresponding to the state point;
(6c) repeating (6a) to (6b) until the image candidate region is traversed, forming the corresponding candidate region in the range-Doppler spectrum dk;
(7) screening the state points:
(7a) screening the sub-candidate regions in the range-Doppler spectrum candidate region to obtain the screened range-Doppler spectrum candidate region, and, according to the correspondence between the image and the Doppler spectrum, obtaining a new image candidate region from the screened range-Doppler spectrum candidate region;
(7b) setting a minimum amplitude threshold Tmin and deleting from the new image candidate region all state points whose amplitude is smaller than the threshold to form the final image candidate region; then assigning the final image candidate region to the image candidate region of the ith target in the kth frame, achieving the screening of the state points in the image;
(8) obtaining, by accumulation and by search respectively, the value function and the backtracking function of all state points in the screened image candidate region, the state points of the ith target in the kth frame being indexed by num = 1, 2, ..., up to the total number of state points;
(9) repeating (5) to (8) until k is equal to R, and comparing the maximum value function in the R-th frame with a set threshold VT: if the maximum is below the threshold, the preliminary detection is considered a false alarm and no target is found; otherwise a target is declared, and the state of each frame is traced back through the continuously updated backtracking function to obtain the predicted track of the ith target over all R frames, the jth element of which represents the prediction state of the ith target in the jth frame;
(10) repeating steps (4) to (9) in sequence until all candidate targets are traversed, obtaining the candidate tracks of all targets and completing the detection of the targets.
2. The method of claim 1, wherein the obtaining of the high resolution video synthetic aperture radar image and the corresponding low resolution range-doppler spectrum for each sub-aperture in (1) is performed as follows:
1a) setting the complex echo signal of the video synthetic aperture radar as an echo matrix of J range gates and Z pulses, and dividing it into R sub-apertures by using an overlapped sliding window with length N and step S, wherein R = ⌊(Z-N)/S⌋ + 1, ⌊·⌋ represents the rounding-down operation, and S < N;
1b) imaging N pulses in each sub-aperture by adopting a polar coordinate format imaging algorithm to obtain a high-resolution video synthetic aperture radar image, extracting W continuous pulses from the center of each sub-aperture, and sequentially performing range-direction compression and azimuth-direction Fourier transform to obtain a corresponding low-resolution range Doppler spectrum;
1c) processing the echo data of each sub-aperture according to mode 1b), finally obtaining R frames of image data I = {i1, i2, ..., ij, ..., iR} and R frames of range-Doppler data D = {d1, d2, ..., dj, ..., dR}, wherein ij represents the j-th frame image, dj represents the j-th range-Doppler spectrum, j = 1, 2, ..., R.
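The overlapped sliding-window division of 1a) to 1c) can be sketched as below; `split_subapertures` is a hypothetical helper name, and the echo is any J × Z complex or real matrix:

```python
import numpy as np

def split_subapertures(echo, N, S):
    """Split a (J range gates x Z pulses) echo matrix into overlapped
    sub-apertures of N pulses each, sliding by S pulses (S < N).
    Yields R = floor((Z - N) / S) + 1 sub-aperture matrices, as in 1a)."""
    J, Z = echo.shape
    R = (Z - N) // S + 1                          # number of sub-apertures
    return [echo[:, r * S : r * S + N] for r in range(R)]
```

Each returned block would then be imaged separately (polar-format imaging for the frame, range compression plus azimuth FFT of its central W pulses for the range-Doppler spectrum).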
3. The method according to claim 1, wherein in (3) all candidate track heads of the first frame are determined from the preliminary detection point sets P1 and P2 by distance matching, implemented as follows:
3a) setting the allowed displacement range of the moving target between two adjacent frames as [0, Lmax] and initializing the set of candidate track heads C as empty, wherein Lmax is the maximum distance the moving target can travel between two adjacent frames;
3b) calculating, for a point (xi, yi) in the first-frame preliminary detection point set P1, the distance L to the point (xnear, ynear) in the second-frame preliminary detection point set P2 that is closest to (xi, yi): L = √((xnear - xi)² + (ynear - yi)²);
3c) judging whether the distance L lies within the range [0, Lmax]: if not, discarding the point (xi, yi) as a false-alarm point; if so, calculating the velocity (vxi, vyi) of the point (xi, yi) as vxi = (xnear - xi)/t and vyi = (ynear - yi)/t, wherein t is the time difference between two adjacent frames, vxi is the transverse velocity of the point (xi, yi), and vyi is its longitudinal velocity;
3d) adding the initial state [xi vxi yi vyi]T of the target to the candidate track head set C, wherein T represents transposition.
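Claim 3's distance matching can be sketched as a gated nearest-neighbour search; `candidate_track_heads` is an illustrative name, and the state ordering [x, vx, y, vy] follows 3d):

```python
import math

def candidate_track_heads(P1, P2, t, L_max):
    """Form initial states [x, vx, y, vy] by matching each first-frame
    detection to its nearest second-frame detection (claim 3 sketch)."""
    heads = []
    for (xi, yi) in P1:
        # nearest point in the second-frame detection set
        xn, yn = min(P2, key=lambda p: math.hypot(p[0] - xi, p[1] - yi))
        L = math.hypot(xn - xi, yn - yi)
        if L > L_max:
            continue                    # discard as a false-alarm point
        vx, vy = (xn - xi) / t, (yn - yi) / t   # velocity from displacement
        heads.append([xi, vx, yi, vy])
    return heads
```

Points whose nearest neighbour lies outside the gate [0, L_max] produce no track head.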
4. The method of claim 1, wherein in (5) state transition is performed on all state points in the image candidate region of the (k-1)th frame to form the image candidate region of the ith target in the kth frame, implemented as follows:
5a) representing the image candidate region of the (k-1)th frame as a discrete state space, in which the num-th state point of the ith target is [x vx y vy]T, num = 1, 2, ..., up to the total number of states of the ith target in the (k-1)th frame, wherein x, vx, y and vy respectively represent the transverse coordinate, transverse velocity, longitudinal coordinate and longitudinal velocity of the state on the (k-1)th frame image ik-1, and T represents transposition;
5b) adopting a uniform-velocity transfer model to perform state transition on any state point [x vx y vy]T of the image candidate region, obtaining
[x' vx' y' vy']T = (I2 ⊗ [1 t; 0 1]) [x vx y vy]T + [0 εx 0 εy]T,
εx ∈ {εx1, εx2, …, εxn}
εy ∈ {εy1, εy2, …, εym}
wherein I2 is the second-order identity matrix, ⊗ represents the Kronecker product of matrices, εx and εy are the acceleration process noises in the x- and y-directions respectively, the set {εx1, εx2, …, εxn} is the selectable range of εx, the set {εy1, εy2, …, εym} is the selectable range of εy, and T denotes transposition;
5c) repeating 5b) until all state points are traversed; all state points after state transition form the image candidate region of the ith target of the kth frame,
wherein the components of each state point respectively represent the transverse coordinate, transverse velocity, longitudinal coordinate and longitudinal velocity on the kth frame image ik; the η-th sub-candidate region of the ith target of the kth frame is indexed among all sub-candidate regions, and T denotes transposition.
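Claim 4's constant-velocity transition can be sketched with the Kronecker-product transition matrix of 5b). One assumption here: the discrete acceleration noise is taken to perturb only the velocity components, as one reading of the claim's noise vector:

```python
import numpy as np

def transition_states(states, t, eps_x, eps_y):
    """Constant-velocity state transition with discrete process noise
    (claim 4 sketch). State ordering: [x, vx, y, vy]."""
    # I2 ⊗ [[1, t], [0, 1]] gives a block-diagonal constant-velocity model
    F = np.kron(np.eye(2), np.array([[1.0, t], [0.0, 1.0]]))
    out = []
    for s in states:
        for ex in eps_x:            # enumerate the selectable noise values
            for ey in eps_y:
                out.append(F @ np.asarray(s, float)
                           + np.array([0.0, ex, 0.0, ey]))
    return out
```

Enumerating every (εx, εy) pair per state is what expands one frame's candidate region into the next frame's, as described in 5c).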
5. The method of claim 1, wherein in (6a) any state point of the image candidate region is transferred to obtain the true state [x' vx' y' vy']T, implemented as follows:
6a1) expressing the coordinates (x0, y0) of any state point taken from the image candidate region as: x0 equal to the first element of the state (its abscissa) and y0 equal to the third element of the state (its ordinate);
6a2) transforming, by geometric relationship, the coordinates (x0, y0) into the coordinates (x1, y1) in the corresponding scene image:
x1 = α·cosθ - β·sinθ + N/2, y1 = α·sinθ + β·cosθ + J/2,
wherein θ is the rotation angle of the radar platform relative to the scene at that moment, N represents the number of pulses, J represents the number of range gates, α = x0 - N/2, β = y0 - J/2, x1 represents the abscissa of the state point in the scene image, and y1 its ordinate;
6a3) converting, by scaling, the coordinates (x1, y1) into the actual position coordinates (x', y') in the stationing coordinate system, wherein ρr represents the range resolution of the video synthetic aperture radar system, ρα represents its azimuth resolution, x' represents the actual abscissa of the state in the stationing coordinate system, and y' its actual ordinate;
6a4) performing state transition on the state point to obtain the state of the target in the next frame; converting it to the actual position coordinates in the stationing coordinate system according to 6a1) to 6a3); and obtaining, from the position difference of the two points and the time interval of the two frames, the actual velocity vx', vy' of the state point, so that x', y', vx' and vy' compose the true state [x' vx' y' vy']T.
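The transformation and scaling of 6a1) to 6a3) can be sketched as a rotation about the image centre followed by resolution scaling. The rotation direction, the centring, and the scaling about the scene centre are assumptions for illustration, since the patent's exact formulas are not reproduced in this text:

```python
import math

def image_to_scene(x0, y0, theta, N, J, rho_a, rho_r):
    """Map a pixel (x0, y0) of an N x J sub-aperture image to stationing
    coordinates: rotate by the platform rotation angle theta about the
    image centre, then scale by the azimuth/range resolutions (sketch)."""
    a, b = x0 - N / 2, y0 - J / 2                     # offsets from centre
    x1 = a * math.cos(theta) - b * math.sin(theta) + N / 2
    y1 = a * math.sin(theta) + b * math.cos(theta) + J / 2
    # scale pixel offsets to metres in the stationing coordinate system
    return (x1 - N / 2) * rho_a, (y1 - J / 2) * rho_r
```

Applying this map to two consecutive states and dividing the position difference by the frame interval gives the actual velocity of 6a4).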
6. The method of claim 1, wherein in (6b) the point in the range-Doppler spectrum corresponding to the state point is obtained according to the true state [x' vx' y' vy']T and the radar flight parameters, implemented as follows:
6b1) taking, for the radar performing uniform circular motion, its position coordinates (xR, yR, zR) and velocity vector vR at the moment image ik is generated; setting the coordinate vector of the point K in the actual scene corresponding to the target as (x', y', 0) and its velocity vector as vK; calculating the normalized slant-range vector rRK of the line connecting the target and the radar as rRK = ((xR, yR, zR) - (x', y', 0)) / |(xR, yR, zR) - (x', y', 0)|, wherein |·| represents the modulus of a vector;
6b2) calculating, according to the slant-range vector rRK, the radar velocity vR and the target velocity vK, the radial velocity vRK of the moving target K relative to the radar: vRK = (vR - vK) · rRK, wherein · represents the dot product;
6b3) calculating, according to the radial velocity vRK, the Doppler frequency fDK of the moving target K relative to the radar as fDK = 2vRK/λ, wherein λ represents the wavelength of the radar-transmitted signal;
6b4) obtaining, from the calculation result in 6b3), the abscissa and the ordinate of the point in the Doppler spectrum dk corresponding to the state point.
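Steps 6b1) to 6b3) can be sketched directly from the vector formulas; the sign convention for the relative velocity (radar minus target, projected on the target-to-radar direction) is an assumption:

```python
import numpy as np

def doppler_of_target(radar_pos, radar_vel, tgt_pos, tgt_vel, wavelength):
    """Radial velocity and Doppler frequency of a moving target relative
    to the radar (claim 6 sketch), using f_D = 2 * v_r / lambda."""
    r = np.asarray(radar_pos, float) - np.asarray(tgt_pos, float)
    r_hat = r / np.linalg.norm(r)              # normalized slant-range vector
    v_rel = np.asarray(radar_vel, float) - np.asarray(tgt_vel, float)
    v_r = float(np.dot(v_rel, r_hat))          # line-of-sight (radial) velocity
    return v_r, 2.0 * v_r / wavelength         # two-way Doppler shift
```

Step 6b4) would then quantize fDK and the slant range to pixel indices of the W-pulse range-Doppler spectrum, which depends on system parameters not reproduced here.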
7. The method of claim 1, wherein in (7a) the sub-candidate regions in the range-Doppler spectrum candidate region are screened, implemented as follows:
7a1) expressing the range-Doppler spectrum candidate region as the set of its sub-candidate regions, the η-th sub-candidate region being indexed among the total number of sub-candidate regions;
7a2) arbitrarily choosing a sub-candidate region, sorting its candidate points from large to small according to their amplitude in the Doppler spectrum dk, and taking the first Ns candidate points to form a screened sub-candidate region;
7a3) repeating step 7a2) until all sub-candidate regions are traversed, forming the screened range-Doppler spectrum candidate region.
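The amplitude screening of 7a2) is a top-Ns selection per sub-candidate region; a minimal sketch with an illustrative `(row, col)` point layout:

```python
def screen_subregion(points, spectrum, Ns):
    """Keep the Ns candidate points with the largest amplitude in the
    range-Doppler spectrum (claim 7 sketch). points are (row, col)
    indices into spectrum, a 2-D array/list of amplitudes."""
    ranked = sorted(points, key=lambda p: spectrum[p[0]][p[1]], reverse=True)
    return ranked[:Ns]
```

Running this over every sub-candidate region yields the screened range-Doppler candidate region of 7a3).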
8. The method according to claim 1, wherein in (8) the value function and the backtracking function of all state points in the screened image candidate region are obtained by accumulation and search, implemented as follows:
8a) representing the screened image candidate region as a discrete state space whose state points [x vx y vy]T are indexed by num = 1, 2, ..., up to the total number of state points after screening, wherein x, vx, y and vy respectively represent the transverse coordinate, transverse velocity, longitudinal coordinate and longitudinal velocity of the num-th state of the ith target on the kth frame image ik, and T represents transposition;
8b) taking any state point of the screened image candidate region, obtaining the value function of the state point by accumulation and the backtracking function of the state point by search: the value function of a state in the kth frame equals the pixel value of image ik at the abscissa and ordinate of that state, added to the maximum of the value functions over the set of state points of the (k-1)th frame image candidate region that can be transferred to that state; the backtracking function of the state records the state point attaining that maximum.
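The accumulation and search of 8b) amount to one Viterbi-style dynamic-programming update per frame; a sketch with illustrative dict-based structures (not the patent's data layout):

```python
def update_value_function(prev_V, transitions, pixel):
    """One dynamic-programming update (claim 8 sketch):
    V_k(s) = max over predecessors s' of V_{k-1}(s'), plus the image
    amplitude at s; the arg-max predecessor is stored for backtracking.

    prev_V:      dict state -> accumulated value at frame k-1
    transitions: dict state at frame k -> list of predecessor states
    pixel:       dict state at frame k -> image amplitude at that state
    """
    V, back = {}, {}
    for s, preds in transitions.items():
        best = max(preds, key=lambda p: prev_V[p])  # search step
        V[s] = prev_V[best] + pixel[s]              # accumulation step
        back[s] = best                              # backtracking pointer
    return V, back
```

Repeating this update for k = 2, ..., R and thresholding the final maximum implements steps (8) and (9) of claim 1.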
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011068438.5A CN112083418B (en) | 2020-10-09 | 2020-10-09 | Moving target joint pre-detection tracking method of video synthetic aperture radar |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112083418A true CN112083418A (en) | 2020-12-15 |
CN112083418B CN112083418B (en) | 2022-05-17 |
Family
ID=73730600
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011068438.5A Active CN112083418B (en) | 2020-10-09 | 2020-10-09 | Moving target joint pre-detection tracking method of video synthetic aperture radar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112083418B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114185047A (en) * | 2021-12-09 | 2022-03-15 | 电子科技大学 | Bistatic SAR moving target refocusing method based on optimal polar coordinate transformation |
CN114415180A (en) * | 2022-03-30 | 2022-04-29 | 中国人民解放军火箭军工程大学 | Stable tracking method fusing SAR high-resolution image and one-dimensional range profile |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104076355A (en) * | 2014-07-04 | 2014-10-01 | 西安电子科技大学 | Method for conducting before-detection tracking on weak and small target in strong-clutter environment based on dynamic planning |
CN110361734A (en) * | 2019-08-27 | 2019-10-22 | 北京无线电测量研究所 | Faint moving target detection method, device, computer equipment and storage medium |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104076355A (en) * | 2014-07-04 | 2014-10-01 | 西安电子科技大学 | Method for conducting before-detection tracking on weak and small target in strong-clutter environment based on dynamic planning |
CN110361734A (en) * | 2019-08-27 | 2019-10-22 | 北京无线电测量研究所 | Faint moving target detection method, device, computer equipment and storage medium |
Non-Patent Citations (2)
Title |
---|
DING J ET AL.: "Efficient Doppler ambiguity resolver for video SAR", ELECTRONICS LETTERS *
LIANG Jian et al.: "Design and imaging algorithm research of space-based video SAR systems", Chinese Space Science and Technology *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114185047A (en) * | 2021-12-09 | 2022-03-15 | 电子科技大学 | Bistatic SAR moving target refocusing method based on optimal polar coordinate transformation |
CN114185047B (en) * | 2021-12-09 | 2023-06-27 | 电子科技大学 | Double-base SAR moving target refocusing method based on optimal polar coordinate transformation |
CN114415180A (en) * | 2022-03-30 | 2022-04-29 | 中国人民解放军火箭军工程大学 | Stable tracking method fusing SAR high-resolution image and one-dimensional range profile |
CN114415180B (en) * | 2022-03-30 | 2022-07-01 | 中国人民解放军火箭军工程大学 | Stable tracking method fusing SAR high-resolution image and one-dimensional range profile |
Also Published As
Publication number | Publication date |
---|---|
CN112083418B (en) | 2022-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Lim et al. | Radar and camera early fusion for vehicle detection in advanced driver assistance systems | |
CN109459750B (en) | Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision | |
CN104851097B (en) | The multichannel SAR GMTI methods aided in based on target shape and shade | |
CN103064086B (en) | Vehicle tracking method based on depth information | |
CN109375177B (en) | Moving target detection method for airport scene surveillance radar system | |
CN112083418B (en) | Moving target joint pre-detection tracking method of video synthetic aperture radar | |
Wen et al. | Video SAR moving target detection using dual faster R-CNN | |
CN102778680B (en) | Method for imaging uniformly accelerated motion rigid group targets based on parameterization | |
CN111208479B (en) | Method for reducing false alarm probability in deep network detection | |
CN108961255B (en) | Sea-land noise scene segmentation method based on phase linearity and power | |
CN112184749B (en) | Moving target tracking method based on video SAR cross-domain combination | |
CN104569964A (en) | Moving target two-dimensional detecting and tracking method for ultra-wideband through-wall radar | |
CN112991391A (en) | Vehicle detection and tracking method based on radar signal and vision fusion | |
CN114280611A (en) | Road side sensing method integrating millimeter wave radar and camera | |
CN109100697B (en) | Target condensation method based on ground monitoring radar system | |
CN113570632B (en) | Small moving target detection method based on high-time-phase space-borne SAR sequential image | |
Yu et al. | Camera-radar data fusion for target detection via Kalman filter and Bayesian estimation | |
CN106707278B (en) | doppler beam sharpening imaging method and device based on sparse representation | |
CN108983194B (en) | Target extraction and condensation method based on ground monitoring radar system | |
Jibrin et al. | An object detection and classification method using radar and camera data fusion | |
CN110490903A (en) | Multiple target fast Acquisition and tracking in a kind of Binocular vision photogrammetry | |
CN109917383A (en) | Low signal-to-noise ratio ISAR imaging method based on echo down-sampling accumulation | |
CN108828549B (en) | Target extraction method based on airport scene surveillance radar system | |
CN104537690B (en) | One kind is based on the united moving spot targets detection method of maximum time index | |
Aguilar et al. | Small moving target MOT tracking with GM-PHD filter and attention-based CNN |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |