CN112163454A - High-frequency ground wave radar clutter intelligent classification and positioning method based on RD spectrum enhancement - Google Patents

High-frequency ground wave radar clutter intelligent classification and positioning method based on RD spectrum enhancement Download PDF

Info

Publication number
CN112163454A
CN112163454A
Authority
CN
China
Prior art keywords
clutter
network
spectrum
image
wave radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010874204.3A
Other languages
Chinese (zh)
Inventor
张玲
李庆丰
牛炯
黎明
纪永刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ocean University of China
Original Assignee
Ocean University of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ocean University of China filed Critical Ocean University of China
Priority to CN202010874204.3A priority Critical patent/CN112163454A/en
Publication of CN112163454A publication Critical patent/CN112163454A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02Preprocessing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/10Image enhancement or restoration by non-spatial domain filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • G06T2207/10044Radar image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20056Discrete and fast Fourier transform, [DFT, FFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20092Interactive image processing based on input by user
    • G06T2207/20104Interactive definition of region of interest [ROI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20112Image segmentation details
    • G06T2207/20132Image cropping

Abstract

The invention discloses an RD spectrum enhancement based intelligent classification and positioning method for clutter of a high-frequency ground wave radar, belonging to the technical field of high-frequency ground wave radar detection. The method comprises the following implementation steps: completing a classification task on ImageNet with a CNN network and updating and storing the network parameters; constructing a radar target detection network based on Faster R-CNN; enhancing the data; training and testing the Faster R-CNN network; mapping the target positions output by the network to actual physical positions on the RD spectrum; formulating corresponding evaluation indexes for the different clutter types; and evaluating the test results.

Description

High-frequency ground wave radar clutter intelligent classification and positioning method based on RD spectrum enhancement
Technical Field
The invention provides an RD spectrum enhancement-based intelligent classification and positioning method for clutter of a high-frequency ground wave radar, and belongs to the technical field of high-frequency ground wave radar detection.
Background
Benefiting from the low propagation loss of vertically polarized electromagnetic waves over the highly conductive sea surface in the high-frequency band, high-frequency surface wave radar (HFSWR) can detect ship targets beyond the horizon, and this over-the-horizon capability gives it an important role in offshore detection and monitoring. However, strong clutter and interference, such as first-order sea clutter, ionospheric clutter and radio-frequency interference (RFI), are mixed into the electromagnetic waves received by the radar and severely limit the detection capability of HFSWR. Theoretical and simulation results show that conventional clutter-suppression methods also significantly reduce the signal-to-noise ratio of the target after clutter or interference suppression is carried out. Therefore, to better preserve signal energy and improve the processing efficiency of the surface wave radar, it is important to determine whether clutter or interference is present, and where, before suppression begins; accurate identification and positioning of clutter and interference are therefore crucial.
The prior art has the following defects: manual feature extraction requires choosing a suitable extraction method, the features carry considerable uncertainty, and the extracted features are not necessarily the most appropriate ones; intelligent feature-extraction methods need a large number of training samples and, when data are insufficient, easily overfit, leaving the network with very poor generalization ability; a small backbone network causes feature loss and can discard a large amount of original image information, since small networks tend to shrink the input picture or enlarge the convolution kernel and stride, so that many key features are missing.
Disclosure of Invention
The invention discloses an RD spectrum enhancement-based intelligent classification and positioning method for clutter of a high-frequency ground wave radar, aiming to solve the problems in the prior art that manually extracted features carry considerable uncertainty, that deep learning requires many training samples, and that valuable information is lost when the original high-resolution RD spectrum is greatly downscaled.
The intelligent classification and positioning method of the clutter of the high-frequency ground wave radar based on the RD spectrum enhancement comprises the following implementation steps:
S1, completing a classification task on ImageNet by utilizing a CNN network;
S2, updating and storing the network parameters;
S3, constructing a radar target detection network based on Faster R-CNN;
S4, enhancing the data;
S5, training and testing the Faster R-CNN network;
S6, mapping the target positions output by the network to the actual physical positions on the RD spectrum;
S7, formulating corresponding evaluation indexes for the different clutter types;
and S8, evaluating the test results.
In step S1, the dimensionality of the CNN fully connected layer is set according to the number of categories in the ImageNet classification data set, and the network parameters are updated according to the cross-entropy loss function.
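The cross-entropy update in step S1 can be illustrated with a minimal sketch in plain Python; the 4-class logits below are hypothetical and only show the quantity being minimized, not the patent's actual training code:

```python
import math

def softmax(logits):
    # Numerically stable softmax over raw class scores.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, true_class):
    # Cross-entropy loss = -log p(true class); its gradient drives the
    # parameter update described in step S1.
    return -math.log(softmax(logits)[true_class])

# Hypothetical 4-class example: a confident correct prediction gives a small loss.
loss = cross_entropy([2.0, 0.5, 0.1, -1.0], true_class=0)
```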
In step S3, the 14 × 14 feature map output by the backbone network is used as the input of both a region proposal network (RPN) and a region-of-interest (ROI) pooling layer; the RPN proposes candidate regions of interest, and the ROI pooling layer collects the input feature map and the RPN proposals, integrates the information, and extracts the feature map corresponding to each proposed region. The fully connected classification layer of the network is set to four dimensions, including a background category; the regression layer is also set to four dimensions, representing the position information of the target box. The feature map of each proposed region is sent to the subsequent fully connected layers to judge the target category and regress the position.
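The ROI pooling operation mentioned above can be sketched as follows; this is a simplified illustration with integer bin boundaries, not the patent's implementation:

```python
def roi_max_pool(feature_map, roi, out_h, out_w):
    # feature_map: 2-D list of activations; roi: (top, left, height, width)
    # in feature-map cells. Each output bin takes the max over its share of
    # the ROI, producing a fixed-size map regardless of the proposal size.
    top, left, h, w = roi
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            r0 = top + i * h // out_h
            r1 = top + (i + 1) * h // out_h
            c0 = left + j * w // out_w
            c1 = left + (j + 1) * w // out_w
            row.append(max(feature_map[r][c]
                           for r in range(r0, max(r1, r0 + 1))
                           for c in range(c0, max(c1, c0 + 1))))
        out.append(row)
    return out
```

A 4 × 4 region pooled to 2 × 2, for example, keeps the maximum of each quadrant.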
In step S4, to expand the training data set while keeping the enhanced samples valid, the corresponding data enhancement methods include the following five methods:
S4.1, edge cropping, which removes useless boundary information from the RD spectrum;
S4.2, center cropping, which, for smaller targets, cuts the image around a clutter according to the network input size and segments it into smaller sub-images;
S4.3, compressing the image with PCA and wavelet transform while retaining its main features;
S4.4, edge cropping combined with graying;
and S4.5, horizontal flipping, which transforms the image space while preserving the pixel information of the original image.
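Three of the listed enhancements (edge cropping, center cropping and horizontal flipping) can be sketched on a plain 2-D list; the function names are illustrative, not from the patent:

```python
def edge_crop(img, margin):
    # S4.1: drop `margin` boundary rows/columns of useless information.
    return [row[margin:len(row) - margin] for row in img[margin:len(img) - margin]]

def center_crop(img, cy, cx, size):
    # S4.2: cut a size x size patch centred on a clutter location (cy, cx).
    half = size // 2
    return [row[cx - half:cx - half + size] for row in img[cy - half:cy - half + size]]

def hflip(img):
    # S4.5: horizontal flip transforms image space but keeps pixel values.
    return [row[::-1] for row in img]
```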
In step S5, an Adam optimizer is used to update the weights of each layer, and the batch size is set to 2. The learning rate decreases exponentially as α = α0 × 0.9^epoch, where epoch is the current training period and the initial learning rate α0 = 0.001. The total number of training epochs is set to 40, the IOU threshold to 0.6, and the images are resized to the network input size of 224 × 224. The measured data are randomly divided into 80% training set and 20% validation set; the training set is augmented with the data-enhancement methods, the Faster R-CNN network is trained, and testing is performed with the test set.
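The exponentially decreasing schedule of step S5 is simple to compute; a sketch with the stated hyperparameters:

```python
def learning_rate(epoch, alpha0=0.001, decay=0.9):
    # alpha = alpha0 * 0.9**epoch, as specified in step S5.
    return alpha0 * decay ** epoch

# 40 training epochs, as in the patent's setup.
schedule = [learning_rate(e) for e in range(40)]
```

By epoch 39 the rate has fallen to roughly 1.6% of its initial value.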
In step S6, two discrete Fourier transforms are performed on the radar echo signal: a discrete Fourier transform along the horizontal axis yields the distance information, and a discrete Fourier transform along the vertical axis yields the Doppler frequency information. The distance range of the RD spectrum is [d_min, d_max] in km and the frequency range is [-f, f] in Hz; the initial RD spectrum size is r × c. Given the clutter coordinates [x, y, w, h] predicted by Faster R-CNN, the actual clutter position (x_d, y_f) is derived from these coordinates together with l_c and l_r, the horizontal and vertical margins cropped when generating the RD spectrum, the cropped margins of the image being zero and the retained image values non-zero constants.
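The patent's exact mapping equation is reproduced only as an image in this copy; the sketch below therefore assumes a straightforward linear mapping from cropped-image pixels back to the range and Doppler axes, with the crop margins l_c and l_r undone first:

```python
def pixel_to_physical(x, y, l_c, l_r, r, c, d_min, d_max, f):
    # Map a predicted pixel (x, y) in the edge-cropped RD image to physical
    # (range in km, Doppler in Hz). The original spectrum has r rows and c
    # columns spanning [d_min, d_max] km and [-f, f] Hz; linear axis scaling
    # is an assumption of this sketch, not taken from the patent text.
    col = x + l_c                      # undo the horizontal crop margin
    row = y + l_r                      # undo the vertical crop margin
    distance = d_min + col * (d_max - d_min) / c
    doppler = -f + row * (2 * f) / r
    return distance, doppler
```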
In step S7, the sea clutter and the ionospheric clutter are evaluated using AP, mAP and RP curves. The recall ratio Recall = TP/(TP + FN) measures how many positive examples are classified as positive, and the precision ratio Precision = TP/(TP + FP) measures the proportion of true positives among the examples classified as positive, where TP denotes the number of positives predicted as positive, TN the number of negatives predicted as negative, FP the number of negatives predicted as positive, and FN the number of positives predicted as negative. With p the precision and r the recall, p being a function of r,

AP = ∫_0^1 p(r) dr

and mAP is the average of the APs over the classes. RRecall is formulated as the RFI evaluation index to judge whether RFI exists on an RD spectrum, calculated as

RRecall = (c_tp + c_tn) / (c_tp + c_tn + c_fp + c_fn)

where c_tp denotes correctly predicted RFI, c_tn denotes a correct prediction that no RFI is present on the RD spectrum, c_fp denotes other targets predicted as RFI, and c_fn denotes RFI predicted as other targets.
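The precision, recall and RRecall indexes of step S7 can be computed directly from the four counts; the RRecall form below is a reconstruction consistent with the surrounding definitions, since the original formula appears only as an image:

```python
def precision(tp, fp):
    # Proportion of true positives among examples classified as positive.
    return tp / (tp + fp)

def recall(tp, fn):
    # Proportion of positive examples that were classified as positive.
    return tp / (tp + fn)

def rrecall(c_tp, c_tn, c_fp, c_fn):
    # RFI screening index: share of RD spectra whose RFI presence or
    # absence was predicted correctly (reconstructed form).
    return (c_tp + c_tn) / (c_tp + c_tn + c_fp + c_fn)
```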
Compared with the prior art, the invention has the following beneficial effects: Faster R-CNN is selected as the detection framework for HFSWR target detection, which greatly simplifies the detection steps; the RD spectrum is input to the network and the output is the final detection result. The end-to-end structure enables the network to extract effective features, and the shared-convolution structure greatly shortens the detection time while the detection accuracy is far better than that of previous methods.
Drawings
FIG. 1 is an overall flow chart of high-frequency ground wave radar clutter detection based on RD spectrum enhancement;
FIG. 2 is a graph of sea clutter RP;
fig. 3 is a diagram of ionospheric clutter RP.
Detailed Description
The present invention will be described in further detail with reference to specific embodiments below:
1. parameter setting
The overall flow chart of high-frequency ground wave radar clutter detection based on RD spectrum enhancement is shown in FIG. 1. GoogLeNet, ResNet50 and ResNet101 are used as the backbone networks for clutter classification, implemented on a GTX 1080Ti GPU. An Adam optimizer is adopted to update the weights of all layers, with momentum added to accelerate weight convergence. Because of the small number of samples, the batch size is set to 2. The learning rate follows an exponentially decreasing schedule, α = α0 × 0.9^epoch, with initial learning rate α0 = 0.001, where epoch is the current training period. In this embodiment, the total number of training epochs is set to 40 and the IOU threshold to 0.6, and non-maximum suppression is performed to select the best target box. 80% of the samples are randomly selected as the training set and 20% as the test set; the training set is expanded with the data-enhancement methods, and the images are resized to the network input size of 224 × 224.
2. Target detection result
The results of clutter/interference classification of the three clutter classes using the Faster R-CNN, YOLO and SH-SVM detection frameworks are shown in Table 1.
TABLE 1 results of multiple clutter detection
The bold face indicates the best detection result for each type. Because the original data set is small and prone to overfitting, comparison experiments are carried out on two enhanced data sets under the different detection frameworks. Data1 is obtained by graying, boundary cropping and horizontal flipping; Data2 adds wavelet transform, PCA image compression and center cropping on top of Data1. The data set for the SH-SVM consists of RD-spectrum patches cropped with a 1:3 ratio of positive to negative samples. To reflect the detection performance, the AP value and the mAP value are used to evaluate the recognition effect on each clutter type and the overall performance. The recall ratio Recall = TP/(TP + FN) measures how many positive examples are classified as positive, and the precision ratio Precision = TP/(TP + FP) measures the proportion of true positives among the examples classified as positive, where TP denotes the number of positives predicted as positive, TN the number of negatives predicted as negative, FP the number of negatives predicted as positive, and FN the number of positives predicted as negative. With p the precision and r the recall, p being a function of r, the calculation formulas are

AP = ∫_0^1 p(r) dr

and mAP = (1/N) Σ_i AP_i, the average value of the APs over the N classes.
The test results show that the mAP reaches 87.78%; for single targets, sea clutter and ionospheric clutter reach 96.87% and 91.02%, respectively. In terms of detection time, the YOLO target detection framework is faster than Faster R-CNN but sacrifices accuracy, while the Faster R-CNN detection framework still achieves real-time detection speed. Since RFI often appears over a large area of the RD spectrum, RRecall is used as the RFI evaluation index to judge whether RFI exists on the RD spectrum, calculated as

RRecall = (c_tp + c_tn) / (c_tp + c_tn + c_fp + c_fn)

where c_tp is a correct prediction that RFI is present in the RD spectrum, c_tn a correct prediction that RFI is absent, c_fp a false prediction that RFI is present, and c_fn a false prediction that RFI is absent. Comparing the network trained with Data2 against the other methods, as shown in Table 2, the method provided by the invention achieves nearly 100% detection accuracy.
TABLE 2 RFI test results
Sea clutter and ionospheric clutter do not usually occupy the full screen, so judging their position and size is more important; to adjust the position of the prediction box so as to locate the clutter and interference, the AP is used to evaluate the recognition and classification effect of the target box. The RP curves of sea clutter and ionospheric clutter are shown in FIGS. 2 and 3, respectively; it can be seen that the accuracy of the RP curves obtained by the invention remains close to 100% as the recall rate increases.
In the SH-SVM detection results, the target size cannot be adapted, and misjudgments occur when some target boxes are too large or too small. When the three clutter types are detected with a network that makes its judgment before suppression, many misjudgments also occur. Compared with the SH-SVM results, the YOLO detection framework has stronger adaptive ability, but its detection confidence is lower, i.e. the probability that a given clutter exists in the target box is lower. The test results of the proposed method under the Faster R-CNN framework show that its classification accuracy for the clutter is nearly 100%; it can adaptively select an appropriate box according to the size of the clutter or interference, display the actual distance range of the clutter, and detect weak ionospheric clutter and sea clutter.
By learning from a large number of samples, the proposed method can effectively extract the features of the various clutter types. Without needing an artificial threshold, it distinguishes the clutter types well and accurately locates the clutter position. Even under strong RFI conditions, the method can still identify the type and location of the other clutter.
As shown in Table 3, the proposed method is more efficient in terms of detection time. It achieves the best detection accuracy, its computational efficiency is 4 times that of the traditional detection method, and its running time is only 0.97 s per RD spectrum, meeting the real-time processing requirement of HFSWR clutter detection.
TABLE 3 Comparison of detection times of the three methods
Method                   | Clutter morphology analysis | Threshold segmentation | Proposed method
Time per RD spectrum (s) | 1.23                        | 1.02                   | 0.97
It is to be understood that the above description is not intended to limit the present invention, and the present invention is not limited to the above examples, and those skilled in the art may make modifications, alterations, additions or substitutions within the spirit and scope of the present invention.

Claims (6)

1. The intelligent classification and positioning method of the clutter of the high-frequency ground wave radar based on the RD spectrum enhancement is characterized by comprising the following implementation steps:
S1, completing a classification task on ImageNet by utilizing a CNN network;
S2, updating and storing the network parameters;
S3, constructing a radar target detection network based on Faster R-CNN;
S4, enhancing the data;
S5, training and testing the Faster R-CNN network;
S6, mapping the target positions output by the network to the actual physical positions on the RD spectrum;
S7, formulating corresponding evaluation indexes for the different clutter types;
and S8, evaluating the test results.
2. The RD spectrum enhancement based intelligent classification and positioning method for clutter of high-frequency ground wave radar of claim 1, wherein in step S1, the dimensionality of the CNN fully connected layer is set according to the number of categories in the ImageNet classification data set, and the network parameters are updated according to the cross-entropy loss function; in step S3, the 14 × 14 feature map output by the backbone network is used as the input of both a region proposal network (RPN) and a region-of-interest (ROI) pooling layer; the RPN proposes candidate regions of interest, and the ROI pooling layer collects the input feature map and the RPN proposals, integrates the information, and extracts the feature map corresponding to each proposed region; the fully connected classification layer of the network is set to four dimensions, including a background category; the regression layer is also set to four dimensions, representing the position information of the target box; the feature map of each proposed region is sent to the subsequent fully connected layers to judge the target category and regress the position.
3. The intelligent classification and location method for high-frequency ground wave radar clutter based on RD spectral enhancement as claimed in claim 1, wherein in step S4, to expand the training data set while keeping the enhanced samples valid, the corresponding data enhancement methods include the following five methods:
S4.1, edge cropping, which removes useless boundary information from the RD spectrum;
S4.2, center cropping, which, for smaller targets, cuts the image around a clutter according to the network input size and segments it into smaller sub-images;
S4.3, compressing the image with PCA and wavelet transform while retaining its main features;
S4.4, edge cropping combined with graying;
and S4.5, horizontal flipping, which transforms the image space while preserving the pixel information of the original image.
4. The RD spectrum enhancement based intelligent classification and positioning method for high-frequency ground wave radar clutter according to claim 1, wherein in step S5, an Adam optimizer is adopted to update the weights of each layer and the batch size is set to 2; the learning rate decreases exponentially as α = α0 × 0.9^epoch, where epoch is the current training period and the initial learning rate α0 = 0.001; the total number of training epochs is set to 40, the IOU threshold to 0.6, and the images are resized to the network input size of 224 × 224; the measured data are randomly divided into 80% training set and 20% validation set, the training set is augmented with the data-enhancement methods, the Faster R-CNN network is trained, and testing is performed with the test set.
5. The RD spectrum enhancement based intelligent classification and location method for high-frequency ground wave radar clutter according to claim 1, wherein in step S6, two discrete Fourier transforms are performed on the radar echo signal: a discrete Fourier transform along the horizontal axis yields the distance information, and a discrete Fourier transform along the vertical axis yields the Doppler frequency information; the distance range of the RD spectrum is [d_min, d_max] in km and the frequency range is [-f, f] in Hz; the initial RD spectrum size is r × c; given the clutter coordinates [x, y, w, h] predicted by Faster R-CNN, the actual clutter position (x_d, y_f) is derived from these coordinates together with l_c and l_r, the horizontal and vertical margins cropped when generating the RD spectrum, the cropped margins of the image being zero and the retained image values non-zero constants.
6. The intelligent classification and location method for high-frequency ground wave radar clutter based on RD spectral enhancement according to claim 1, wherein in step S7, sea clutter and ionospheric clutter are evaluated using AP, mAP and RP curves; the recall ratio Recall = TP/(TP + FN) measures how many positive examples are classified as positive, and the precision ratio Precision = TP/(TP + FP) measures the proportion of true positives among the examples classified as positive, where TP denotes the number of positives predicted as positive, TN the number of negatives predicted as negative, FP the number of negatives predicted as positive, and FN the number of positives predicted as negative; with p the precision and r the recall, p being a function of r,

AP = ∫_0^1 p(r) dr

and mAP is the average of the APs over the classes; RRecall is formulated as the RFI evaluation index to judge whether RFI exists on an RD spectrum, calculated as

RRecall = (c_tp + c_tn) / (c_tp + c_tn + c_fp + c_fn)

where c_tp denotes correctly predicted RFI, c_tn denotes a correct prediction that no RFI is present on the RD spectrum, c_fp denotes other targets predicted as RFI, and c_fn denotes RFI predicted as other targets.
CN202010874204.3A 2020-08-27 2020-08-27 High-frequency ground wave radar clutter intelligent classification and positioning method based on RD spectrum enhancement Pending CN112163454A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010874204.3A CN112163454A (en) 2020-08-27 2020-08-27 High-frequency ground wave radar clutter intelligent classification and positioning method based on RD spectrum enhancement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010874204.3A CN112163454A (en) 2020-08-27 2020-08-27 High-frequency ground wave radar clutter intelligent classification and positioning method based on RD spectrum enhancement

Publications (1)

Publication Number Publication Date
CN112163454A true CN112163454A (en) 2021-01-01

Family

ID=73860321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010874204.3A Pending CN112163454A (en) 2020-08-27 2020-08-27 High-frequency ground wave radar clutter intelligent classification and positioning method based on RD spectrum enhancement

Country Status (1)

Country Link
CN (1) CN112163454A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113505697A (en) * 2021-07-09 2021-10-15 哈尔滨工业大学 High-frequency ground wave radar clutter classification method based on U-Net network

Citations (10)

Publication number Priority date Publication date Assignee Title
CN101369018A (en) * 2007-08-17 2009-02-18 电子科技大学 Satellite machine combined double-base synthetic aperture radar frequency domain imaging method
CN101975948A (en) * 2010-10-28 2011-02-16 电子科技大学 Imaging method for remote sensing satellite irradiation source forward-looking synthetic aperture radar
CN103207387A (en) * 2013-03-26 2013-07-17 北京理工雷科电子信息技术有限公司 Method for quickly simulating airborne phased array pulse Doppler (PD) radar clutter
CN104749570A (en) * 2015-04-01 2015-07-01 电子科技大学 Shift invariant airborne bistatic synthetic aperture radar target positioning method
CN105738887A (en) * 2016-01-29 2016-07-06 西安电子科技大学 Airborne radar clutter power spectrum optimization method based on Doppler channel division
CN106093870A (en) * 2016-05-30 2016-11-09 西安电子科技大学 The SAR GMTI clutter suppression method of hypersonic aircraft descending branch
CN106226745A (en) * 2016-08-04 2016-12-14 武汉大学 A kind of external illuminators-based radar clutter suppression method based on sub-band processing and device
CN106707247A (en) * 2017-03-24 2017-05-24 武汉大学 High-frequency ocean radar target detection method based on compact antenna array
CN108196240A (en) * 2018-02-07 2018-06-22 中国人民解放军国防科技大学 Ground moving target track reconstruction method suitable for CSAR imaging
CN110210463A (en) * 2019-07-03 2019-09-06 中国人民解放军海军航空大学 Radar target image detecting method based on Precise ROI-Faster R-CNN

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王祎鸣 et al., "A CFAR detection verification method for ground wave radar based on AIS range-Doppler projection", 《海洋学报》 (Acta Oceanologica Sinica) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113505697A (en) * 2021-07-09 2021-10-15 哈尔滨工业大学 High-frequency ground wave radar clutter classification method based on U-Net network
CN113505697B (en) * 2021-07-09 2022-07-29 哈尔滨工业大学 High-frequency ground wave radar clutter classification method based on U-Net network

Similar Documents

Publication Publication Date Title
CN113689428B (en) Mechanical part stress corrosion detection method and system based on image processing
CN115829883B (en) Surface image denoising method for special-shaped metal structural member
CN107808138B (en) Communication signal identification method based on FasterR-CNN
CN107861107A (en) A kind of double threshold CFAR suitable for continuous wave radar and Plot coherence method
CN107507209B (en) Printogram extraction method of polarized SAR image
CN107742113B (en) One kind being based on the posterior SAR image complex target detection method of destination number
CN107563397A (en) Cloud cluster method for calculation motion vector in a kind of satellite cloud picture
CN112541481A (en) Sea detection radar target detection method based on deep learning
CN109726649A (en) Remote sensing image cloud detection method of optic, system and electronic equipment
CN111611907A (en) Image-enhanced infrared target detection method
CN115661649B (en) BP neural network-based shipborne microwave radar image oil spill detection method and system
CN116402825B (en) Bearing fault infrared diagnosis method, system, electronic equipment and storage medium
CN107274410A (en) Adaptive man-made target constant false alarm rate detection method
CN104680536A (en) Method for detecting SAR image change by utilizing improved non-local average algorithm
Zou et al. A method of radar echo extrapolation based on TREC and Barnes filter
CN112163454A (en) High-frequency ground wave radar clutter intelligent classification and positioning method based on RD spectrum enhancement
CN102081799A (en) Method for detecting change of SAR images based on neighborhood similarity and double-window filtering
CN106709501B (en) Scene matching area selection and reference image optimization method of image matching system
CN108986083B (en) SAR image change detection method based on threshold optimization
CN111368653B (en) Low-altitude small target detection method based on R-D graph and deep neural network
CN112329677A (en) Remote sensing image river target detection method and device based on feature fusion
CN110310263B (en) SAR image residential area detection method based on significance analysis and background prior
CN107729903A (en) SAR image object detection method based on area probability statistics and significance analysis
CN116778341A (en) Multi-view feature extraction and identification method for radar image
CN114240940B (en) Cloud and cloud shadow detection method and device based on remote sensing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210101)