CN103995258B - Adaptive fusion detection method for radar targets in complex clutter edge environments

Adaptive fusion detection method for radar targets in complex clutter edge environments

Info

Publication number
CN103995258B
Authority
CN
China
Prior art keywords
reference data
data
clutter
difference
Prior art date
Legal status
Active
Application number
CN201410248860.7A
Other languages
Chinese (zh)
Other versions
CN103995258A (en)
Inventor
简涛
何友
熊伟
苏峰
平殿发
王海鹏
周强
Current Assignee
Naval Aeronautical University
Original Assignee
Naval Aeronautical Engineering Institute of PLA
Priority date
Filing date
Publication date
Application filed by Naval Aeronautical Engineering Institute of PLA
Priority to CN201410248860.7A
Publication of CN103995258A
Application granted
Publication of CN103995258B
Status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 - Details of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414 - Discriminating targets with respect to background clutter

Abstract

The invention discloses an adaptive fusion detection method for radar targets in complex clutter edge environments, belonging to the field of radar signal processing. In complex clutter environments, clutter edges may appear simultaneously in the leading and lagging halves of the reference sliding window, and existing CFAR detection methods suffer from poor false-alarm control, degraded detection performance, and excessive computational load. To overcome these shortcomings, the present invention proposes an adaptive fusion detection method for radar targets in complex clutter edge environments. The method performs homogeneity judgment based on the differences between adjacent reference data, adapts to changes in the number and position of clutter edges, judges the position of the clutter edge relative to the range cell under test, screens out homogeneous reference data representative of the clutter background of the range cell under test, and fuses them to form the detection threshold. While maintaining the constant false alarm rate property, it effectively improves the radar's false-alarm control capability and detection performance in complex clutter edge environments, avoids loop or iterative operations, has a small computational load, and is easy to implement in engineering.

Description

Adaptive fusion detection method for radar targets in complex clutter edge environments
Technical field
The present invention belongs to the field of radar signal processing, and specifically relates to an adaptive fusion detection method for radar targets in complex clutter edge environments.
Background
Radar is an important tool for detecting targets with radio waves in both military and civilian applications. The false-alarm problem in automatic radar detection is one of the unavoidable key issues in radar system design. Constant false alarm rate (CFAR) techniques are the principal means of controlling the false alarm rate in automatic radar detection systems and play an important role in the automatic detection process. The goal of CFAR design is to provide a detection threshold that adapts to variations in background clutter, noise, and interference, so that the automatic target detection process maintains a constant false-alarm probability.
A clutter edge is the transition zone between regions of the detection background with different characteristics; typical examples are the boundary of a rainfall area or a land-sea interface. If the cell under test lies in a weak clutter region while some reference cells in the sliding window lie in a strong clutter region, target detection is masked even at a high signal-to-noise ratio, and both the detection probability and the false-alarm probability decrease. Conversely, if the cell under test lies in a strong clutter region while some reference cells lie in a weak clutter region, the false-alarm probability rises sharply. This is a major issue in radar design: the classical cell-averaging (CA) CFAR detector suffers a sharp drop in detection performance and an excessive rise in false alarm rate in clutter-edge environments. As a modification of the CA-CFAR detector, the greatest-of (GO) CFAR method was designed specifically for the clutter-edge situation; it takes the greater of the two partial estimates formed from the leading and lagging halves of the sliding window as the overall clutter power estimate, and can cope with a clutter edge appearing in a single half of the window. However, when the local non-homogeneity of the clutter increases, clutter edges are likely to appear in both halves of the sliding window simultaneously; in that case the false-alarm control capability of the GO-CFAR method degrades and no longer meets the CFAR detection requirement. In addition, ordered-statistic (OS) CFAR methods estimate the unknown clutter power from an ordered sequence of the reference samples by selecting representative reference data, which effectively improves the detector's ability to cope with interfering targets, but their false-alarm control at clutter edges still falls short of the CFAR requirement. Moreover, most current adaptive CFAR methods select representative reference data through iterative censoring; the complex loops or iterations involved lead to a huge computational load and limit the practical application of such adaptive CFAR methods against complex clutter-edge backgrounds.
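For context only, a minimal Python sketch of the classical CA and GO threshold formation discussed above follows; it assumes square-law-detected samples, and the helper name cfar_threshold is hypothetical (this block illustrates the prior art, not the method of the invention).

```python
import numpy as np

def cfar_threshold(lead: np.ndarray, lag: np.ndarray, alpha: float, mode: str = "CA") -> float:
    """Classical CFAR clutter-level estimate scaled by the threshold factor alpha.

    lead/lag: reference samples before and after the cell under test.
    CA averages both halves; GO takes the greater of the two half-window means,
    which copes with a clutter edge confined to a single half of the window.
    """
    if mode == "CA":
        level = np.concatenate([lead, lag]).mean()
    elif mode == "GO":
        level = max(lead.mean(), lag.mean())
    else:
        raise ValueError("mode must be 'CA' or 'GO'")
    return alpha * level
```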
In practical applications, clutter edges may appear simultaneously in the leading and lagging halves of the sliding window. To obtain an estimate of the average clutter power level of the range cell under test, the position of the clutter edge relative to the cell under test must be determined, and reference cells representative of the clutter background of the cell under test must be selected for computing the detection threshold. Because clutter edges may appear in both halves of the sliding window simultaneously and their positions are unknown, it is of great practical significance to judge the clutter edge positions accurately, to fuse the useful information in the reference data effectively, and to design a simple and efficient adaptive CFAR detection method that adapts to clutter edge variations.
Summary of the invention
1. Technical problem to be solved
In complex clutter edge environments, existing CFAR detection methods suffer from poor false-alarm control, degraded detection performance, and excessive computational load. To overcome these problems, the present invention performs homogeneity judgment based on the differences between adjacent reference data, screens out homogeneous reference data representative of the clutter-plus-noise background of the range cell under test, and fuses them to form the detection threshold. An adaptive fusion detection method for radar targets in complex clutter edge environments is proposed which, on the basis of accurately judging the position of the clutter edge relative to the range cell under test, improves the radar's false-alarm control capability and detection performance in complex clutter edge environments.
2. Technical solution
The adaptive fusion detection method for radar targets in complex clutter edge environments of the present invention comprises the following technical measures:
Step 1: obtain the reference data, estimate the variance, and compute the differences between adjacent reference data
Centered on the observed value x of the range cell under test, take the radar echo video observations of M range cells on each side of it, forming 2M reference data x_m, m = 1, 2, ..., 2M;
Compute the variance estimate σ̂² of all 2M reference data, namely
$$\hat{\sigma}^2 = \frac{1}{2M}\sum_{m=1}^{2M}\left(x_m - \frac{1}{2M}\sum_{k=1}^{2M}x_k\right)^2 \qquad (1)$$
Compute the differences y_m, m = 2, 3, ..., 2M, between reference data of adjacent range cells, namely
$$y_m = x_m - x_{m-1}, \quad m = 2, 3, \ldots, 2M \qquad (2)$$
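As an illustration of Step 1, the short Python sketch below (an assumption of this write-up, not part of the patent; the function name step1_variance_and_differences and the use of NumPy arrays are illustrative) computes the variance estimate of formula (1) and the adjacent differences of formula (2).

```python
import numpy as np

def step1_variance_and_differences(ref: np.ndarray):
    """ref: the 2M reference samples taken around the range cell under test.

    Returns the variance estimate of formula (1) and the adjacent differences
    y_m = x_m - x_{m-1} of formula (2) (an array of length 2M - 1).
    """
    sigma2_hat = np.mean((ref - ref.mean()) ** 2)   # formula (1)
    y = np.diff(ref)                                # formula (2)
    return sigma2_hat, y
```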
Step 2: judge the homogeneity of the reference data from the differences between adjacent reference data and determine the clutter edge positions
Find the maximum of the absolute values of all 2M-1 differences and denote the subscript of the corresponding difference by p, namely
$$p = \arg\max_m \left(\left|y_m\right|\right), \quad m = 2, 3, \ldots, 2M \qquad (3)$$
In the above formula, |·| denotes the absolute value and arg max_m denotes the value of m at which the bracketed quantity is maximized;
(1) If the difference y_p of maximum absolute value satisfies
$$\left|y_p\right| < 2\beta\hat{\sigma} \qquad (4)$$
then no clutter edge exists among the 2M reference data; go to Step 3. In formula (4), β is an adjustment coefficient that can be set according to the actual situation and generally should satisfy β ≥ 1;
If y_p does not satisfy formula (4), split the reference data at subscript p into two groups, x_m, m = 1, ..., p-1, and x_k, k = p, ..., 2M, and subject each group to the data-consistency test of the following step (2);
(2) Data-consistency test
For a data sequence z_l, l = 1, ..., L, of length L, the consistency test statistic V_L can be expressed as
$$V_L = L\sum_{l=1}^{L} z_l^2 \Big/ \left(\sum_{l=1}^{L} z_l\right)^2 \qquad (5)$$
In general, the smaller V_L is, the better the consistency of the data sequence;
If V_L satisfies
$$V_L < H_L \qquad (6)$$
the data sequence is judged to be consistent; otherwise it is judged to be inconsistent. In formula (6), H_L denotes the consistency decision threshold, which can be set according to the sequence length L;
(3) If both groups of reference data, x_m, m = 1, ..., p-1, and x_k, k = p, ..., 2M, are consistent, the 2M reference data can be divided into two homogeneous groups; go to Step 3. Otherwise perform step (4);
(4) Group all difference data y_m, m = 2, 3, ..., 2M, by sign, and denote the set of differences whose sign is opposite to that of y_p by S_p, namely
$$S_p = \left\{\, y_m \mid y_m y_p < 0,\ m = 2, 3, \ldots, 2M \,\right\} \qquad (7)$$
Find the difference of maximum absolute value in the set S_p and denote its subscript by q, namely
$$q = \arg\max_m \left(\left|y_m\right|\right), \quad y_m \in S_p \qquad (8)$$
Denote the smaller and the larger of p and q by a and b respectively, namely
$$a = \min(p, q) \qquad (9)$$
$$b = \max(p, q) \qquad (10)$$
where min(·) and max(·) take the minimum and the maximum respectively. With subscripts a and b as boundaries, the 2M reference data can be divided into three homogeneous groups; go to Step 3;
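A minimal sketch of the Step 2 decision logic follows, continuing the Step 1 example. The patent only states that the consistency threshold H_L is set from the group length L, so it is passed in here as a user-supplied function; the names consistency_ok and step2_edge_positions are illustrative assumptions.

```python
import numpy as np

def consistency_ok(z: np.ndarray, H_L: float) -> bool:
    """Consistency test of formulas (5)-(6): V_L = L * sum(z^2) / (sum(z))^2 < H_L."""
    L = len(z)
    V_L = L * np.sum(z ** 2) / np.sum(z) ** 2
    return V_L < H_L

def step2_edge_positions(ref: np.ndarray, y: np.ndarray, sigma2_hat: float, H_L, beta: float = 1.0):
    """Return None (no edge), (p,) for one clutter edge, or (a, b) for two edges.

    ref: 2M reference samples; y: adjacent differences from Step 1, where index i
    corresponds to the 1-based subscript m = i + 2 of the patent; H_L: callable
    giving the consistency threshold for a group of a given length.
    """
    two_M = len(ref)
    p = int(np.argmax(np.abs(y))) + 2                        # formula (3)
    if np.abs(y[p - 2]) < 2.0 * beta * np.sqrt(sigma2_hat):  # formula (4)
        return None                                          # no clutter edge
    # split at p and test both groups with formulas (5)-(6)
    if consistency_ok(ref[:p - 1], H_L(p - 1)) and consistency_ok(ref[p - 1:], H_L(two_M - p + 1)):
        return (p,)                                          # one clutter edge at p
    # formula (7): differences of opposite sign to y_p; formula (8): largest of them
    opposite = np.where(y * y[p - 2] < 0)[0]
    q = int(opposite[np.argmax(np.abs(y[opposite]))]) + 2
    return (min(p, q), max(p, q))                            # formulas (9)-(10)
```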
Step 3: determine the screened homogeneous reference data according to the homogeneity decision of Step 2
Denote the N screened homogeneous reference data by X_m, m = 1, ..., N;
If step (1) judged that no clutter edge exists among the 2M reference data x_m, m = 1, 2, ..., 2M, then all 2M reference data are considered homogeneous, and we set
$$N = 2M \qquad (11)$$
$$X_m = x_m \qquad (12)$$
Otherwise, if step (3) judged that the 2M reference data split at subscript p form two homogeneous groups, the screened homogeneous reference data are determined from the relative magnitudes of p and M;
If
$$M < p - 1 \qquad (13)$$
then the first p-1 reference data x_m, m = 1, ..., p-1, are taken as the screened homogeneous reference data, namely
$$N = p - 1 \qquad (14)$$
$$X_m = x_m \qquad (15)$$
Otherwise, the screened homogeneous reference data are determined according to the result of step (4):
If
$$M < a - 1 \qquad (16)$$
then the first a-1 reference data x_m, m = 1, ..., a-1, are taken as the screened homogeneous reference data, namely
$$N = a - 1 \qquad (17)$$
$$X_m = x_m \qquad (18)$$
Otherwise, if
$$M \geq b \qquad (19)$$
then the last 2M-b+1 reference data x_m, m = b, ..., 2M, are taken as the screened homogeneous reference data, namely
$$N = 2M - b + 1 \qquad (20)$$
$$X_m = x_{m+b-1} \qquad (21)$$
Otherwise, the middle b-a reference data x_m, m = a, ..., b-1, are taken as the screened homogeneous reference data, namely
$$N = b - a \qquad (22)$$
$$X_m = x_{m+a-1} \qquad (23)$$
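The selection rules of Step 3 can be sketched as below, again continuing the earlier examples. One hedge: when a single clutter edge at p is found but M ≥ p-1, the patent defers to the step-(4) formulas; this sketch simply falls back to the trailing group x_p, ..., x_2M in that case, which is one reading of the text rather than a verbatim transcription.

```python
import numpy as np

def step3_select_homogeneous(ref: np.ndarray, edges) -> np.ndarray:
    """Screen the homogeneous reference data X_1..X_N of formulas (11)-(23).

    ref: the 2M reference samples; edges: output of step2_edge_positions
    (None, (p,), or (a, b) with a < b).
    """
    M = len(ref) // 2
    if edges is None:                       # no clutter edge: formulas (11)-(12)
        return ref
    if len(edges) == 1:                     # one clutter edge at p
        p = edges[0]
        if M < p - 1:                       # formulas (13)-(15): leading p-1 cells
            return ref[:p - 1]
        # interpretation (assumption): cell under test lies on the trailing side,
        # so use the trailing group x_p .. x_{2M}
        return ref[p - 1:]
    a, b = edges                            # two clutter edges
    if M < a - 1:                           # formulas (16)-(18): leading a-1 cells
        return ref[:a - 1]
    if M >= b:                              # formulas (19)-(21): trailing 2M-b+1 cells
        return ref[b - 1:]
    return ref[a - 1:b - 1]                 # formulas (22)-(23): middle b-a cells
```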
Step 4: fuse the screened homogeneous reference data to form the detection threshold, and decide on the presence or absence of a target
Fuse the screened homogeneous reference data X_m, m = 1, ..., N, to estimate the mean clutter-plus-noise intensity Z, namely
$$Z = \sum_{m=1}^{N} X_m \Big/ N \qquad (24)$$
Set the detection threshold T from the clutter-plus-noise intensity estimate Z, namely
$$T = \alpha Z \qquad (25)$$
In the above formula, the threshold factor α is set according to the preset false-alarm probability, which guarantees the CFAR property of the detection method;
Finally, a detection decision is made on the observed value x of the range cell under test: if x ≥ T, a target is declared present in the cell under test; otherwise, if x < T, no target is declared present.
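A sketch of the Step 4 threshold formation and decision follows. The patent does not give a closed form for the threshold factor α; for square-law-detected, exponentially distributed homogeneous clutter the classical cell-averaging relation α = N(Pfa^(-1/N) - 1) is used below as an illustrative assumption.

```python
import numpy as np

def step4_detect(x_cut: float, X: np.ndarray, pfa: float) -> bool:
    """Form the threshold of formulas (24)-(25) and decide target present/absent.

    x_cut: observed value of the range cell under test; X: screened homogeneous
    reference data from Step 3; pfa: preset false-alarm probability.
    """
    N = len(X)
    Z = X.mean()                               # formula (24): mean clutter-plus-noise intensity
    # threshold factor from the preset Pfa; exact relation assumed (see lead-in)
    alpha = N * (pfa ** (-1.0 / N) - 1.0)
    T = alpha * Z                              # formula (25)
    return x_cut >= T                          # True: target declared present
```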
3. Beneficial effects
Compared with the background art, the beneficial effects of the invention are as follows: in complex clutter environments where clutter edges may appear simultaneously in the leading and lagging halves of the sliding window, the proposed method adapts to changes in the number and position of the clutter edges, screens out homogeneous reference data representative of the clutter background of the range cell under test, and improves the radar's false-alarm control capability and target detection performance in complex clutter edge environments; it avoids loop or iterative operations, has a small computational load, and is easy to implement in engineering.
Description of the drawings
The accompanying drawing is a functional block diagram of the proposed detection method. In the figure: 1. variance and difference calculation module; 2. clutter edge position estimation module; 3. reference data screening module; 4. detection threshold calculation module; 5. detection decision module.
Detailed description of the embodiments
The invention is further described below with reference to the accompanying drawing. The embodiments serve to explain and illustrate the invention rather than to limit it; any modification or variation of the invention within the spirit of the invention and the protection scope of the claims falls within the protection scope of the invention.
With reference to the accompanying drawing, the specific embodiment of the invention comprises the following steps:
Step 1: obtain the reference data, estimate the variance, and compute the differences between adjacent reference data
The signal received by the radar antenna is amplified and mixed, passed through a matched filter for energy accumulation, and then through a detector to obtain the radar echo video signal, which is sampled by range resolution cell to give the radar echo video observations. Centered on the observed value x of the range cell under test, the radar echo video observations of M range cells on each side are taken, forming 2M reference data x_m, m = 1, 2, ..., 2M, which are fed into the variance and difference calculation module (1); the variance of all 2M reference data is estimated according to formula (1), and the differences y_m, m = 2, 3, ..., 2M, between reference data of adjacent range cells are computed according to formula (2);
Step 2: the variance estimate and the difference data y_m, m = 2, 3, ..., 2M, obtained in Step 1 are fed into the clutter edge position estimation module (2), the homogeneity of the reference data is judged from the adjacent differences, and the clutter edge positions are determined, specifically by the following steps (1) to (4):
(1) Obtain the difference y_p of maximum absolute value according to formula (3) and set the adjustment coefficient β of formula (4) according to the actual situation; if y_p satisfies the condition of formula (4), go to Step 3;
(2) If y_p does not satisfy the condition of formula (4), split the reference data at subscript p into two groups, x_m, m = 1, ..., p-1, and x_k, k = p, ..., 2M, and test each group for data consistency according to formulas (5) and (6);
(3) If both groups of reference data, x_m, m = 1, ..., p-1, and x_k, k = p, ..., 2M, are consistent, go to Step 3; otherwise perform step (4);
(4) Obtain the set S_p of differences whose sign is opposite to that of y_p according to formula (7), find the subscript q corresponding to the difference of maximum absolute value in S_p according to formula (8), obtain the subscripts a and b according to formulas (9) and (10), and go to Step 3;
Step 3: the homogeneity decision of Step 2 is fed into the reference data screening module (3), and N homogeneous reference data X_m, m = 1, ..., N, are screened out according to formulas (11) to (23);
Step 4: the homogeneous reference data X_m, m = 1, ..., N, screened in Step 3 are fed into the detection threshold calculation module (4), fused according to formula (24) to form the mean clutter-plus-noise intensity estimate Z, and the detection threshold T is set according to formula (25); finally, the detection threshold T is fed into the detection decision module (5), the detection decision is made on the data x under test, and the detection result is output: if x ≥ T, a target is declared present in the cell under test; otherwise, if x < T, no target is declared present.
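To show how the five modules of the block diagram chain together, the end-to-end sketch below reuses the hypothetical Step 1 to Step 4 functions from the description above on synthetic data containing a clutter edge; the data model, the consistency threshold H_L, and all parameter values are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 16
# synthetic window: weak clutter in the leading half, strong clutter in the lagging half
ref = np.concatenate([rng.exponential(1.0, M), rng.exponential(10.0, M)])
x_cut = rng.exponential(1.0)                  # cell under test drawn from the weak region

H_L = lambda L: 2.0 + 4.0 / np.sqrt(L)        # illustrative consistency threshold vs. length L

sigma2_hat, y = step1_variance_and_differences(ref)       # module 1: variance and differences
edges = step2_edge_positions(ref, y, sigma2_hat, H_L)     # module 2: clutter edge position
X = step3_select_homogeneous(ref, edges)                  # module 3: reference data screening
target_present = step4_detect(x_cut, X, pfa=1e-4)         # modules 4 and 5: threshold and decision
print(edges, len(X), target_present)
```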

Claims (1)

1. An adaptive fusion detection method for radar targets in complex clutter edge environments, characterized by comprising the following steps:
Step 1: obtain the reference data, estimate the variance, and compute the differences between adjacent reference data
Centered on the observed value x of the range cell under test, take the radar echo video observations of M range cells on each side of it, forming 2M reference data x_m, where m = 1, 2, ..., 2M;
Compute the variance estimate σ̂² of all 2M reference data, namely
$$\hat{\sigma}^2 = \frac{1}{2M}\sum_{m=1}^{2M}\left(x_m - \frac{1}{2M}\sum_{k=1}^{2M}x_k\right)^2 \qquad (1)$$
Compute the differences y_m, where m = 2, 3, ..., 2M, between reference data of adjacent range cells, namely
$$y_m = x_m - x_{m-1}, \quad m = 2, 3, \ldots, 2M \qquad (2)$$
Step 2: judge the homogeneity of the reference data from the differences between adjacent reference data and determine the clutter edge positions
Find the maximum of the absolute values of all 2M-1 differences and denote the subscript of the corresponding difference by p, namely
$$p = \arg\max_m \left(\left|y_m\right|\right), \quad m = 2, 3, \ldots, 2M \qquad (3)$$
In the above formula, |·| denotes the absolute value and arg max_m denotes the value of m at which the bracketed quantity is maximized;
(1) If the difference y_p of maximum absolute value satisfies
$$\left|y_p\right| < 2\beta\hat{\sigma} \qquad (4)$$
then no clutter edge exists among the 2M reference data; go to Step 3; in formula (4), β is an adjustment coefficient that can be set according to the actual situation and must satisfy β ≥ 1;
If y_p does not satisfy formula (4), split the reference data at subscript p into two groups, x_m and x_k, where m = 1, ..., p-1 and k = p, ..., 2M, and subject each group to the data-consistency test of the following step (2);
(2) Data-consistency test
For a data sequence z_l of length L, where l = 1, ..., L, the consistency test statistic V_L can be expressed as
$$V_L = L\sum_{l=1}^{L} z_l^2 \Big/ \left(\sum_{l=1}^{L} z_l\right)^2 \qquad (5)$$
where the smaller V_L is, the better the consistency of the data sequence;
If V_L satisfies
$$V_L < H_L \qquad (6)$$
the data sequence is judged to be consistent; otherwise it is judged to be inconsistent; in formula (6), H_L denotes the consistency decision threshold, which can be set according to the sequence length L;
(3) If both groups of reference data x_m and x_k are consistent, where m = 1, ..., p-1 and k = p, ..., 2M, then the 2M reference data can be divided into two homogeneous groups; go to Step 3; otherwise perform step (4);
(4) Group all difference data y_m by sign, where m = 2, 3, ..., 2M, and denote the set of differences whose sign is opposite to that of y_p by S_p, namely
$$S_p = \left\{\, y_m \mid y_m y_p < 0,\ m = 2, 3, \ldots, 2M \,\right\} \qquad (7)$$
Find the difference of maximum absolute value in the set S_p and denote its subscript by q, namely
$$q = \arg\max_m \left(\left|y_m\right|\right), \quad y_m \in S_p \qquad (8)$$
Denote the smaller and the larger of p and q by a and b respectively, namely
$$a = \min(p, q) \qquad (9)$$
$$b = \max(p, q) \qquad (10)$$
where min(·) and max(·) take the minimum and the maximum respectively; with subscripts a and b as boundaries, the 2M reference data can be divided into three homogeneous groups; go to Step 3;
Step 3: determine the screened homogeneous reference data according to the homogeneity decision of Step 2
Denote the N screened homogeneous reference data by X_m, m = 1, ..., N;
If step (1) judged that no clutter edge exists among the 2M reference data x_m, where m = 1, 2, ..., 2M, then all 2M reference data are considered homogeneous, and we set
$$N = 2M \qquad (11)$$
$$X_m = x_m \qquad (12)$$
Otherwise, if step (3) judged that the 2M reference data split at subscript p form two homogeneous groups, the screened homogeneous reference data are determined from the relative magnitudes of p and M;
If
$$M < p - 1 \qquad (13)$$
then the first p-1 reference data x_m, where m = 1, ..., p-1, are taken as the screened homogeneous reference data, namely
$$N = p - 1 \qquad (14)$$
$$X_m = x_m \qquad (15)$$
Otherwise, the screened homogeneous reference data are determined according to the result of step (4):
If
$$M < a - 1 \qquad (16)$$
then the first a-1 reference data x_m, where m = 1, ..., a-1, are taken as the screened homogeneous reference data, namely
$$N = a - 1 \qquad (17)$$
$$X_m = x_m \qquad (18)$$
Otherwise, if
$$M \geq b \qquad (19)$$
then the last 2M-b+1 reference data x_m, where m = b, ..., 2M, are taken as the screened homogeneous reference data, namely
$$N = 2M - b + 1 \qquad (20)$$
$$X_m = x_{m+b-1} \qquad (21)$$
Otherwise, the middle b-a reference data x_m, where m = a, ..., b-1, are taken as the screened homogeneous reference data, namely
$$N = b - a \qquad (22)$$
$$X_m = x_{m+a-1} \qquad (23)$$
Step 4: fuse the screened homogeneous reference data to form the detection threshold, and decide on the presence or absence of a target
Fuse the screened homogeneous reference data X_m, where m = 1, ..., N, to estimate the mean clutter-plus-noise intensity Z, namely
$$Z = \sum_{m=1}^{N} X_m \Big/ N \qquad (24)$$
Set the detection threshold T from the clutter-plus-noise intensity estimate Z, namely
$$T = \alpha Z \qquad (25)$$
In the above formula, the threshold factor α is set according to the preset false-alarm probability, which guarantees the CFAR property of the detection method;
Finally, a detection decision is made on the observed value x of the range cell under test: if x ≥ T, a target is declared present in the cell under test; otherwise, if x < T, no target is declared present.
CN201410248860.7A 2014-06-06 2014-06-06 Adaptive fusion detection method for radar targets in complex clutter edge environments Active CN103995258B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410248860.7A CN103995258B (en) 2014-06-06 2014-06-06 Adaptive fusion detection method for radar targets in complex clutter edge environments

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410248860.7A CN103995258B (en) 2014-06-06 2014-06-06 Adaptive fusion detection method for radar targets in complex clutter edge environments

Publications (2)

Publication Number Publication Date
CN103995258A CN103995258A (en) 2014-08-20
CN103995258B true CN103995258B (en) 2016-04-13

Family

ID=51309473

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410248860.7A Active CN103995258B (en) 2014-06-06 2014-06-06 Adaptive fusion detection method for radar targets in complex clutter edge environments

Country Status (1)

Country Link
CN (1) CN103995258B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104729664B (en) * 2015-03-02 2018-03-30 北方工业大学 Optical fiber vibration detection method and device
CN105182312B (en) * 2015-09-29 2017-09-12 大连楼兰科技股份有限公司 The CFAR detection method of adaptive environment change
CN106093903B (en) * 2016-06-17 2018-06-15 电子科技大学 Multiple target CFAR detection method based on unilateral detection unit cumulative mean
CN106872958B (en) * 2017-04-27 2019-04-12 中国人民解放军海军航空大学 Radar target self-adapting detecting method based on linear fusion
CN107765228B (en) * 2017-09-29 2019-11-01 西安电子科技大学 A kind of online radar target detection method based on region similitude
CN109856637B (en) * 2017-11-30 2021-09-03 比亚迪股份有限公司 Automobile and constant false alarm based automobile radar target detection method and device
CN108919225B (en) * 2018-07-26 2020-06-30 中国人民解放军海军航空大学 Distance extension target multichannel fusion detection method under partial uniform environment
CN109633597A (en) * 2019-01-23 2019-04-16 广州辰创科技发展有限公司 A kind of variable mean value sliding window CFAR detection algorithm and storage medium
CN113296070A (en) * 2020-02-24 2021-08-24 光宝科技股份有限公司 Arithmetic device for object detection and object detection method
CN111273249B (en) * 2020-03-04 2022-07-08 清华大学 Intelligent clutter partition method based on radar false alarm preprocessing time
CN112965040B (en) * 2021-02-05 2024-01-23 重庆邮电大学 Self-adaptive CFAR target detection method based on background pre-screening

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09145829A (en) * 1995-11-28 1997-06-06 Mitsubishi Electric Corp Radar signal processing unit
CN101271160B (en) * 2007-03-21 2011-05-11 中国科学院电子学研究所 Method and device for real-time detection of SAR movement objective by choosing small unit average constant false alarm rate
CA2774377C (en) * 2012-02-02 2017-05-02 Raytheon Canada Limited Knowledge aided detector
CN103076602B (en) * 2012-12-27 2016-06-01 中国人民解放军海军航空工程学院 For the radar self-adaption constant false alarm rate fusion detection method of multiple goal background
CN103149555B (en) * 2013-01-25 2014-10-29 西安电子科技大学 Self-adaptive moving target detection method capable of combining polarized classification and power grouping

Also Published As

Publication number Publication date
CN103995258A (en) 2014-08-20


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200612

Address after: 264001 Research and Academic Department, 188 Erma Road, Zhifu District, Yantai City, Shandong Province

Patentee after: Naval Aeronautical University

Address before: 264001 No. 188 Erma Road, Zhifu District, Yantai, Shandong

Patentee before: Naval Aeronautical and Astronautical University, PLA