CN112327265A - Divide-and-conquer detection method based on semantic segmentation network - Google Patents

Divide-and-conquer detection method based on semantic segmentation network

Info

Publication number
CN112327265A
CN112327265A (application CN202011147125.9A)
Authority
CN
China
Prior art keywords
clutter
detection
unit
image
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011147125.9A
Other languages
Chinese (zh)
Inventor
胡程
王锐
周超
李思伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202011147125.9A
Publication of CN112327265A
Legal status: Pending

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/411Identification of targets based on measurements of radar reflectivity
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414Discriminating targets with respect to background clutter
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a divide-and-conquer detection method based on a semantic segmentation network. The invention realizes clutter region segmentation of the range-Doppler domain by means of a semantic segmentation network, sets a detection strategy according to the characteristics of the bodies and edges of different clutter regions, and selects a suitable detector with reasonable detection parameters from an alternative detector group to complete divide-and-conquer detection, effectively suppressing false alarms while detecting targets to the maximum extent. The technology is of great significance for radar detection of low-altitude small targets. Compared with detection-plane normalization methods, the method requires no detection-plane conversion, has a smaller computational load, avoids the influence of parameter estimation errors on the detection-plane conversion, and is more robust.

Description

Divide-and-conquer detection method based on semantic segmentation network
Technical Field
The invention belongs to the technical field of low-altitude small target radar detection, and particularly relates to a divide-and-conquer detection method based on a semantic segmentation network.
Background
A low-altitude small target is a flying target with a flight altitude below 1000 meters and a radar cross-section of less than 2 square meters. Low-altitude small targets are closely related to human production and daily life: according to relevant statistics, biological targets such as birds are the main cause of bird-strike incidents, while artificial targets such as unmanned aerial vehicles are common tools of terrorist attacks. Radar is a powerful tool for detecting low-altitude small targets and is of great significance for early warning and improving public safety.
In radar detection of low-altitude small targets, owing to the targets' distinctive motion characteristics, detection is easily affected by various kinds of clutter, such as ground vegetation and weather phenomena like clouds and rain, and different types of clutter present different clutter regions in the range-Doppler domain. The traditional target detection method uses a single detector to perform constant false alarm rate (CFAR) detection in the range-Doppler domain; it cannot balance the detection performance across different clutter regions and readily produces false alarms and missed detections. The current processing idea is to identify the clutter regions and adopt a differentiated processing strategy: after the clutter regions are identified, the detection plane containing different clutter regions is usually normalized into an exponential distribution with a common parameter, and target detection is then carried out. However, this method must transform every value of the detection plane, which entails a huge computational load and depends heavily on the parameter estimation result; when the parameter estimation is inaccurate, a uniform detection plane cannot be formed and the detection performance deteriorates rapidly.
Disclosure of Invention
In view of this, the invention provides a divide-and-conquer detection method based on a semantic segmentation network, which can adopt different detectors to detect different clutter distribution areas, thereby improving detection accuracy.
A divide-and-conquer detection method based on a semantic segmentation network processes the echo signal of a radar to form an image; the image is cropped, and the cropped image is identified through a semantic segmentation network to obtain a recognition result comprising the range-Doppler domain clutter-free area, rain clutter area and ground clutter area; then different detectors and corresponding detection strategies are adopted to detect the clutter-free area and the rain clutter area.
Preferably, the method for processing the echo signal of the radar to form the image includes:
performing pulse compression and coherent accumulation on an echo signal of the radar to obtain a range-Doppler domain data matrix; then converting the range-Doppler domain data matrix into an energy ratio matrix; and converting the obtained energy ratio matrix into picture pixel values with the same size in a linear mapping mode to finally form an image.
Preferably, the semantic segmentation network performs neural network learning before use, and the specific method is as follows:
and labeling each pixel in the image by combining program pre-labeling and manual adjustment, and performing neural network learning by taking a labeling result and the image as input objects of a semantic segmentation network.
Preferably, the detection strategy is:
(1) the unit to be detected is located in the ground clutter region
Target detection is not performed because the ground clutter region is a non-detection region;
(2) the unit to be detected is located in a clutter-free area
When the unit to be detected is positioned in the clutter-free area, a CA-CFAR detector is adopted;
(3) the unit to be detected is located in the rain clutter area
When the unit to be detected is located in the rain clutter area, a CA-CFAR detector is adopted;
and after the detector is selected, setting detection parameters according to the detectors selected in different clutter areas, and completing target detection.
Preferably, the detection strategy further considers the following cases when non-homogeneous units are mixed in:
(1) the unit to be detected is located in a clutter-free area
When non-homogeneous units are mixed into the reference window, that is, interference from other clutter regions exists at its edge, an SO-CFAR detector is adopted;
(2) the unit to be detected is located in the rain clutter area
When one half-window of the reference cells is mixed with non-homogeneous units from the clutter-free area, a GO-CFAR detector is adopted;
when non-homogeneous units are mixed in other forms, GO-CFAR detectors are used.
Advantageous effects:
1. Clutter region segmentation of the range-Doppler domain is realized by means of a semantic segmentation network, a detection strategy is formulated according to the characteristics of the bodies and edges of different clutter regions, and a suitable detector with reasonable detection parameters is selected from an alternative detector group to complete divide-and-conquer detection, effectively suppressing false alarms while detecting targets to the maximum extent. The technology is of great significance for radar detection of low-altitude small targets. Compared with detection-plane normalization methods, the method requires no detection-plane conversion, has a smaller computational load, avoids the influence of parameter estimation errors on the detection-plane conversion, and is more robust.
Drawings
Fig. 1 is the overall algorithm flow.
Fig. 2 is the mapping of energy ratio to image pixel value.
Fig. 3(a) shows the flow of the pre-labeling program.
Fig. 3(b) illustrates the recognition result of the pre-labeling program.
Fig. 4 is a structural diagram of the divide-and-conquer detector.
Fig. 5(a) shows measured-data collection scenario 1.
Fig. 5(b) shows measured-data collection scenario 2.
Fig. 6(a) is a schematic diagram of the rain clutter range-Doppler domain.
Fig. 6(b) is a schematic diagram of the ground clutter range-Doppler domain.
Fig. 7(a) is a schematic diagram of a data picture input.
Fig. 7(b) is a schematic diagram of a data picture label.
Fig. 8(a) shows the clutter recognition result: picture input.
Fig. 8(b) shows the clutter recognition result: network output.
Fig. 8(c) shows the clutter recognition result: image label.
Fig. 9(a) shows the detection result of CA-CFAR, example 1.
Fig. 9(b) shows the detection result of GO-CFAR, example 1.
Fig. 9(c) shows the detection result of SO-CFAR, example 1.
Fig. 9(d) shows the clutter recognition result, example 1.
Fig. 9(e) shows the detection result of the divide-and-conquer method, example 1.
Fig. 10(a) shows the detection result of CA-CFAR, example 2.
Fig. 10(b) shows the detection result of GO-CFAR, example 2.
Fig. 10(c) shows the detection result of SO-CFAR, example 2.
Fig. 10(d) shows the clutter recognition result, example 2.
Fig. 10(e) shows the detection result of the divide-and-conquer method, example 2.
Detailed Description
The invention is described in detail below with reference to the accompanying drawings and two embodiments.
The invention provides a divide-and-conquer detection method based on a semantic segmentation network. The basic idea is to achieve clutter region division of the range-Doppler domain through a semantic segmentation network, and then select suitable detectors and detection parameters from an alternative detector group according to the attributes of the cell under test and the reference cells, so as to improve detection performance in complex clutter environments. The overall algorithm flow is shown in fig. 1.
Step one, performing pulse compression and coherent accumulation on an echo signal of the radar to obtain a range-Doppler domain data matrix. On the basis, the range-doppler domain data matrix is converted into an energy ratio matrix according to equation (1).
p_{i,j} = |h_{i,j}|² / ( (1/N) · Σ_{m,n} |h_{m,n}|² )    (1)
where p_{i,j} is the (i, j)-th element of the energy ratio matrix P, h_{i,j} is the (i, j)-th element of the range-Doppler domain data matrix H, and N is the number of elements of the matrix.
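The conversion in step one can be sketched in Python; the equation-(1) form used here (per-cell energy normalized by the mean cell energy) is a reconstruction from the surrounding definitions, and the matrix is a toy example:

```python
import numpy as np

def energy_ratio_matrix(H):
    """Convert a complex range-Doppler data matrix H into an energy
    ratio matrix P: each element's energy |h_ij|^2 is expressed
    relative to the mean cell energy of the whole matrix."""
    energy = np.abs(H) ** 2          # per-cell energy |h_ij|^2
    return energy / energy.mean()    # ratio to the average cell energy

# Toy example: a 4x4 complex "range-Doppler" matrix.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
P = energy_ratio_matrix(H)
print(P.shape, round(float(P.mean()), 6))  # by construction, P averages to 1
```

Because the ratio is taken against the mean, the result is scale-invariant: multiplying the echo by a constant gain leaves P unchanged.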
And step two, mapping the energy ratio matrix in the step one into an image pixel value according to the formula (2).
pixel_{i,j} = floor( k · (p_{i,j} − S_min) )    (2)
where pixel_{i,j} is the pixel value of the (i, j)-th element, floor(·) denotes the rounding-down operation, S_min is the mapping truncation threshold, and k is the mapping slope. The obtained energy ratio matrix is thus converted by linear mapping into picture pixel values of the same size, with the resulting pixel values upper-bounded by 255 and lower-bounded by 0. The specific mapping relationship is shown in fig. 2, and the full expression is given by formula (3).
pixel_{i,j} = 0 for p_{i,j} ≤ S_min;  floor( 255 · (p_{i,j} − S_min) / (S_max − S_min) ) for S_min < p_{i,j} < S_max;  255 for p_{i,j} ≥ S_max    (3)
where S_max represents the maximum value in the energy ratio matrix.
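The linear mapping of formulas (2)-(3) can be sketched as follows; the truncation threshold `s_min` and the sample matrix are illustrative values, not the patent's:

```python
import numpy as np

def to_pixels(P, s_min=0.5):
    """Linearly map energy-ratio values onto 8-bit pixel values:
    the slope k stretches [s_min, P.max()] onto [0, 255], and values
    outside that interval are truncated to 0 or 255."""
    s_max = P.max()
    k = 255.0 / (s_max - s_min)                 # mapping slope
    pix = np.floor(k * (P - s_min))             # linear map with rounding down
    return np.clip(pix, 0, 255).astype(np.uint8)

P = np.array([[0.1, 0.5], [1.0, 2.0]])
img = to_pixels(P)
print(img)  # pixel values: [[0, 0], [85, 255]]
```

The value 0.1 falls below the truncation threshold and is clipped to 0, while the matrix maximum 2.0 maps exactly to 255.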
And step three, cropping the image obtained in step two into slices according to the requirements of the semantic segmentation network used and the hardware performance. For example, the DeepLab v3+ network requires the width and height of the input image to be greater than 320 pixels.
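A minimal sketch of the cropping step, assuming simple non-overlapping tiling (the patent does not specify the tiling scheme); the tile size follows the 512-pixel slices used later in the embodiment:

```python
import numpy as np

def crop_to_tiles(img, tile=512):
    """Cut a pixel image into non-overlapping tile x tile slices.
    Edge remainders smaller than a full tile are dropped in this sketch."""
    h, w = img.shape[:2]
    return [img[r:r + tile, c:c + tile]
            for r in range(0, h - tile + 1, tile)
            for c in range(0, w - tile + 1, tile)]

tiles = crop_to_tiles(np.zeros((1024, 1536), dtype=np.uint8), tile=512)
print(len(tiles))  # 2 rows x 3 columns = 6 tiles
```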
And step four, dividing the clutter regions according to their attributes into the range-Doppler domain clutter-free area, rain clutter area and ground clutter area. The image slices obtained in step three are input into the semantic segmentation network to obtain its recognition results for the clutter-free area, rain clutter area and ground clutter area.
at the initial use, neural network training (BP learning) is required for the semantic segmentation network. BP learning requires providing labels corresponding to the entries. The tag contains clutter class information corresponding to the input. The characteristics of different clutters need to be considered when marking the clutter classes. For the ground clutter, the distribution is mainly in a central Doppler channel, and the strength relation along the distance direction and the extension degree in the Doppler direction are related to a scene; for rain clutter, as raindrops are filled in the whole radar wave beam, the distance direction and the Doppler direction of a rain clutter area are expanded in different degrees, a block-shaped clutter area is formed, and the area is large. The clutter-free area also presents block distribution, but the intensity is obviously lower than that of the rain clutter area because of no clutter energy interference. And when the picture is marked, the marking precision can be improved by referring to the prior information.
To improve labeling precision and efficiency, the labels are obtained by combining program pre-labeling with manual adjustment. Program pre-labeling is a sequence of pixel filling, pixel culling, and dilation-erosion operations; the specific flow is shown in figs. 3(a) and 3(b). Because the program's parameter and method adaptability are limited, it cannot complete the labeling and classification process on its own. Manual adjustment therefore follows the pre-labeling, mainly checking whether the clutter class is correct and calibrating the clutter edge contours.
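The pre-labeling chain of pixel filling, culling, and dilation-erosion might look like the following NumPy-only sketch; the threshold, 3x3 window, and operation order are illustrative assumptions, not the exact program of fig. 3(a):

```python
import numpy as np

def pre_label(img, thresh=128):
    """Rough program pre-labeling: threshold the pixel image, then apply
    one erosion followed by one dilation (a morphological opening) with a
    3x3 window, which culls isolated pixels while keeping solid blobs.
    Real labels still need manual correction, as the text notes."""
    mask = img >= thresh                         # pixel fill by thresholding

    def shift_stack(m):
        # Stack all nine 3x3-neighbourhood shifts of a zero-padded mask.
        p = np.pad(m, 1)
        return np.stack([p[1 + dr:1 + dr + m.shape[0],
                           1 + dc:1 + dc + m.shape[1]]
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1)])

    eroded = shift_stack(mask).all(axis=0)       # erosion: whole window set
    opened = shift_stack(eroded).any(axis=0)     # dilation: any neighbour set
    return opened.astype(np.uint8)               # 1 = clutter, 0 = background

img = np.zeros((7, 7), dtype=np.uint8)
img[2:5, 2:5] = 200      # a solid 3x3 clutter blob survives the opening
img[0, 0] = 255          # an isolated speck is culled
labels = pre_label(img)
print(int(labels.sum()))  # 9: only the blob remains
```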
Fifthly, dividing and detecting by combining the original range-Doppler domain data matrix and the clutter identification result in the fourth step; the structure of the divide and conquer detector is shown in fig. 4, and the specific detection strategy is as follows:
the characteristics of the three distinguished areas are different, and the distribution range of the rain clutter area and the clutter-free area is large and is an area where a target can exist in a large probability; the ground clutter area is caused by a strong ground reflection object, the energy is high, targets falling into the area are often submerged by clutter, and the targets are not easy to detect. Therefore, the ground clutter region is set as a non-detection region, and the non-clutter region and the rain clutter region are set as regions to be detected. The research and actual measurement of predecessors show that the amplitudes of a clutter-free area (background noise) and a rain clutter area can be approximately described by Rayleigh distribution, so that an average CFAR detector is selected for target detection.
Because regions with different attributes are mainly distributed in blocks, non-homogeneous samples are mostly mixed in from one side at clutter edges during target detection; meanwhile, the target distribution is sparse and the multi-target phenomenon is not obvious. Therefore, to combine the advantages of different detectors, different detection strategies are set according to the clutter recognition result and the attributes of the cell under test and the reference cells, as follows:
(1) the unit to be detected is located in the ground clutter region
Since the ground clutter region is a non-detection region, no target detection is performed regardless of the attribute of the reference cell.
(2) The unit to be detected is located in a clutter-free area
When the reference unit is not mixed with non-homogeneous units, a CA-CFAR detector is employed.
When the reference cells are mixed with non-homogeneous units, the edge of the reference window has characteristics different from the local clutter area and interference from other clutter regions exists, so an SO-CFAR detector is adopted to counter the target masking effect caused by the strong energy of the heterogeneous units.
(3) The unit to be detected is located in the rain clutter area
When the reference unit is not mixed with non-homogeneous units, a CA-CFAR detector is employed.
When the reference cell half-window is mixed with non-homogeneous cells in the clutter free zone, a GO-CFAR detector is used to counter false alarms generated by clutter edges due to weak mixed cell energy.
When non-homogeneous units are mixed in other forms, the GO-CFAR detector is used uniformly; since such cases are rare, no finer distinction is made here.
After the detector is selected, detection parameters, which may differ between clutter areas, are set according to the detector chosen for each area and empirical experience, completing target detection.
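The detection strategy above can be summarized as a selection table; the region encoding and the homogeneity test below are illustrative assumptions about the interface, not the patent's exact implementation:

```python
# Hypothetical region labels for the divide-and-conquer detector selection.
GROUND, CLUTTER_FREE, RAIN = "ground", "clutter_free", "rain"

def select_detector(cut_region, ref_regions):
    """Pick a CFAR variant from the region of the cell under test (CUT)
    and the regions of its reference cells; None means no detection."""
    if cut_region == GROUND:
        return None                      # ground clutter: non-detection region
    if all(r == cut_region for r in ref_regions):
        return "CA-CFAR"                 # homogeneous reference window
    if cut_region == CLUTTER_FREE:
        return "SO-CFAR"                 # counter target masking at edges
    return "GO-CFAR"                     # rain area: counter edge false alarms

print(select_detector(CLUTTER_FREE, [CLUTTER_FREE] * 8))          # CA-CFAR
print(select_detector(RAIN, [RAIN] * 4 + [CLUTTER_FREE] * 4))     # GO-CFAR
print(select_detector(GROUND, [GROUND] * 8))                      # None
```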
Example (b):
In this example, the effectiveness of the algorithm is verified using measured clutter data, comprising ground clutter and rain clutter data; the specific experimental scenes are shown in figs. 5(a) and 5(b). The radar parameters relevant to the algorithm are shown in table 1.
TABLE 1 Radar parameters
The range-doppler domain obtained after pulse compression and coherent accumulation of the typical rain clutter and ground clutter echoes is shown in fig. 6(a) and 6 (b).
The divide-and-conquer detection method based on the semantic segmentation network is verified on the measured data; the specific flow is as follows:
step one, pulse compression and 512-frame coherent accumulation are carried out on the echo of the actually measured clutter data. And (4) converting the obtained range-Doppler domain data matrix into pixel values of the image according to the formulas (4) to (6), labeling the image by using an image morphology method, and manually and repeatedly correcting to obtain a more accurate labeling result. The image is then cropped by cropping to 512 pixel wide and 512 pixel high image slices as a network trained dataset. Fig. 7(a) and 7(b) give examples of data sets.
Step two, the dataset from step one is divided into a training set and a test set at a 7:3 ratio. DeepLab v3+, developed by Google, is selected as the semantic segmentation network and trained with the training set. A fine-tuning approach is adopted: the network's pre-trained weights on the VOC2012 dataset are loaded, except that the logits layer performing class judgment is retrained. The specific training parameters are shown in the following table.
TABLE 2 network training parameters
The learning rate is calculated as shown in equation (7).
current_lr = base_lr · (1 − global_step / training_step)^learning_power    (7)
where current_lr is the current learning rate, base_lr the initial learning rate, global_step the current training step, training_step the total number of training steps, and learning_power the decay exponent.
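Equation (7) is the familiar "poly" decay schedule; a sketch with an assumed decay exponent of 0.9 (DeepLab's usual default, not stated in the text) and illustrative values:

```python
def poly_lr(base_lr, global_step, training_step, learning_power=0.9):
    """'Poly' learning-rate decay: the rate falls smoothly from base_lr
    at step 0 to zero at the final training step."""
    return base_lr * (1 - global_step / training_step) ** learning_power

print(poly_lr(0.007, 0, 30000))             # start of training: base_lr
print(poly_lr(0.007, 15000, 30000))         # halfway: base_lr * 0.5**0.9
print(poly_lr(0.007, 30000, 30000))         # end of training: 0.0
```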
And step three, inputting the test set sample in the data set into a neural network to obtain a clutter recognition result. Fig. 8(a), 8(b) and 8(c) show a set of recognition results.
Step four, according to the characteristics of the different clutter types, divide-and-conquer detection is performed on the bodies and edges of the different clutter areas with the following detection strategy:
(1) the unit to be detected is located in the ground clutter region
Since the ground clutter region is a non-detection region, no target detection is performed regardless of the attribute of the reference cell.
(2) The unit to be detected is located in a clutter-free area
When the reference unit is not mixed with non-homogeneous units, a CA-CFAR detector is employed.
When the reference cell is mixed with a non-homogeneous cell, an SO-CFAR detector is used to counter the target shadowing effect caused by clutter edges due to the strong energy of the heterogeneous cell.
(3) The unit to be detected is located in the rain clutter area
When the reference unit is not mixed with non-homogeneous units, a CA-CFAR detector is employed.
When the reference cell half-window is mixed with non-homogeneous cells in the clutter free zone, a GO-CFAR detector is used to counter false alarms generated by clutter edges due to weak mixed cell energy.
When non-homogeneous units are mixed in other forms, the GO-CFAR detector is used uniformly; since such cases are rare, no finer distinction is made here.
The specific detection parameters are as follows. Two-stage threshold detection is adopted: the first stage is fixed-threshold detection with a 10 dB threshold, and points exceeding the fixed threshold undergo two-dimensional CFAR detection with 48 reference cells in each direction. The preset false alarm rate of the clutter-free area is 10⁻⁶; to suppress false alarms caused by tailing in the rain clutter area, the false alarm rate of the rain clutter area is appropriately lowered to 10⁻⁸.
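A two-stage CA-CFAR pass over a power map might look like the sketch below; the window sizes, the median noise-floor estimate for stage 1, and the exponential-noise scaling factor are illustrative choices (the experiment itself uses 48 reference cells per direction):

```python
import numpy as np

def two_stage_ca_cfar(power, fixed_db=10.0, guard=2, ref=4, pfa=1e-6):
    """Stage 1: fixed threshold fixed_db (in dB) over a median noise-floor
    estimate. Stage 2: 2D CA-CFAR around each surviving cell, with the
    guard region and the cell under test excised from the noise estimate."""
    rows, cols = power.shape
    floor = np.median(power)
    stage1 = power > floor * 10 ** (fixed_db / 10)
    half = guard + ref
    hits = []
    for r, c in np.argwhere(stage1):
        if r < half or c < half or r >= rows - half or c >= cols - half:
            continue                     # skip cells lacking a full window
        win = power[r - half:r + half + 1, c - half:c + half + 1].copy()
        win[ref:ref + 2 * guard + 1, ref:ref + 2 * guard + 1] = np.nan
        n = np.count_nonzero(~np.isnan(win))
        alpha = n * (pfa ** (-1.0 / n) - 1.0)  # CA-CFAR factor, exp. noise
        if power[r, c] > alpha * np.nanmean(win):
            hits.append((int(r), int(c)))
    return hits

rng = np.random.default_rng(1)
noise = rng.exponential(1.0, size=(64, 64))
noise[32, 32] = 500.0                    # inject a strong point target
hits = two_stage_ca_cfar(noise)
print(hits)
```

The injected target clears both stages, while the exponential background is rejected at the preset false alarm rate.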
Figs. 9(a)-(e) and 10(a)-(e) show examples of the detection results. It can be seen that the proposed method effectively suppresses false alarms caused by ground clutter and rain clutter and has better detection performance for targets located in clutter edge environments.
The performance of the proposed method and of the mean-level CFAR detection methods is evaluated on the test set; the results are shown in table 3.
TABLE 3 Performance comparison of the detection methods
On the premise of essentially not losing targets, the divide-and-conquer detection method effectively suppresses false alarms and achieves higher detection performance. The verification with measured clutter data demonstrates the validity of the divide-and-conquer detection method based on the semantic segmentation network; the method improves the detection performance for low-altitude small targets.
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (5)

1. A divide-and-conquer detection method based on a semantic segmentation network, characterized in that an echo signal of a radar is processed to form an image; the image is cropped, and the cropped image is identified through a semantic segmentation network to obtain a recognition result comprising a range-Doppler domain clutter-free area, a rain clutter area and a ground clutter area; then different detectors and corresponding detection strategies are adopted to detect the clutter-free area and the rain clutter area.
2. The divide-and-conquer detecting method according to claim 1, wherein the echo signal of the radar is processed to form an image by:
performing pulse compression and coherent accumulation on an echo signal of the radar to obtain a range-Doppler domain data matrix; then converting the range-Doppler domain data matrix into an energy ratio matrix; and converting the obtained energy ratio matrix into picture pixel values with the same size in a linear mapping mode to finally form an image.
3. The divide and conquer detection method of claim 1, wherein the semantic segmentation network performs neural network learning before use, and the specific method is as follows:
and labeling each pixel in the image by combining program pre-labeling and manual adjustment, and performing neural network learning by taking a labeling result and the image as input objects of a semantic segmentation network.
4. The divide-and-conquer detection method of claim 2 or 3, wherein the detection strategy is:
(1) the unit to be detected is located in the ground clutter area, and target detection is not carried out;
(2) the unit to be detected is located in a clutter-free area, and a CA-CFAR detector is adopted for target detection;
(3) and the unit to be detected is positioned in the rain clutter area, and a CA-CFAR detector is adopted to detect the target.
5. The divide-and-conquer detection method of claim 4, wherein the detection strategy further considers, when non-homogeneous units are mixed in:
(1) the unit to be detected is located in a clutter-free area, and when the reference unit is mixed with the non-homogeneous unit, the SO-CFAR detector is adopted for detection;
(2) the unit to be detected is located in the rain clutter area: when a half-window of the reference cells is mixed with non-homogeneous units from the clutter-free area, a GO-CFAR detector is adopted for detection; when non-homogeneous units are mixed in other forms, a GO-CFAR detector is also used for detection.
CN202011147125.9A 2020-10-23 2020-10-23 Divide-and-conquer detection method based on semantic segmentation network Pending CN112327265A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011147125.9A CN112327265A (en) 2020-10-23 2020-10-23 Divide-and-conquer detection method based on semantic segmentation network


Publications (1)

Publication Number Publication Date
CN112327265A true CN112327265A (en) 2021-02-05

Family

ID=74312177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011147125.9A Pending CN112327265A (en) 2020-10-23 2020-10-23 Divide-and-conquer detection method based on semantic segmentation network

Country Status (1)

Country Link
CN (1) CN112327265A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113688682A (en) * 2021-07-23 2021-11-23 北京理工雷科电子信息技术有限公司 Clutter identification and target detection method based on improved FCN (fuzzy C-means) deep network
CN114089301A (en) * 2021-11-05 2022-02-25 哈尔滨工程大学 Novel adaptive sonar target detection method based on neural network and computer equipment
CN115616577A (en) * 2022-12-19 2023-01-17 广东大湾区空天信息研究院 Environment self-adaptive vehicle-mounted millimeter wave radar detection method and device and related equipment

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009074839A (en) * 2007-09-19 2009-04-09 Nec Corp Clutter discrimination method and radar apparatus
US20090096662A1 (en) * 2006-11-02 2009-04-16 Jian Wang Moving Target Detector for Radar Systems
CN101872014A (en) * 2010-06-18 2010-10-27 深圳麒景雷信科技有限公司 Target signal detection method based on improved COSGO (Average Order Statistics Greatest of)-CFAR (Constant False Alarm Rate)
CN102141610A (en) * 2010-12-23 2011-08-03 哈尔滨工业大学 Range-Doppler spectrum-based ionized layer clutter region identification method
CN103197298A (en) * 2013-03-21 2013-07-10 西安电子科技大学 Radar signal processing method based on environmental information
CN103760542A (en) * 2014-01-10 2014-04-30 杭州电子科技大学 MMVI-CFAR target detection method
CN107993215A (en) * 2017-11-27 2018-05-04 象辑知源(武汉)科技有限公司 Weather radar image processing method and system
CN108615238A (en) * 2018-05-08 2018-10-02 重庆邮电大学 HF radar sea clutter region extraction method based on region segmentation
CN108829826A (en) * 2018-06-14 2018-11-16 清华大学深圳研究生院 Image retrieval method based on deep learning and semantic segmentation
CN109543589A (en) * 2018-11-16 2019-03-29 西安电子科技大学 Sea-land scene segmentation based on phase-Doppler range invariance and KNN
CN110298271A (en) * 2019-06-17 2019-10-01 上海大学 Sea area detection method based on keypoint detection network and spatially constrained mixture model
CN110689037A (en) * 2018-07-06 2020-01-14 塔塔咨询服务有限公司 Method and system for automatic object annotation using deep networks
CN110837078A (en) * 2018-08-16 2020-02-25 国家海洋局第一海洋研究所 Target detection method based on correlation characteristics for array ground wave radar under sea clutter background

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhou Chao et al.: "Analysis of measured sea clutter data from a Ku-band experimental radar", Journal of Signal Processing (信号处理), vol. 31, no. 12, pp. 1573-1578 *
Zhang Chengfeng et al.: "Research on a two-dimensional CFAR detection algorithm based on clutter background segmentation", Journal of China Academy of Electronics and Information Technology (中国电子科学研究院学报), vol. 13, no. 02, pp. 156-159 *

Similar Documents

Publication Publication Date Title
Haeffelin et al. Evaluation of mixing-height retrievals from automatic profiling lidars and ceilometers in view of future integrated networks in Europe
CN112327265A (en) Divide-and-conquer detection method based on semantic segmentation network
CN107861107B (en) Double-threshold CFAR (constant false alarm rate) and plot condensation method suitable for continuous wave radar
Lo et al. Fractal characterisation of sea-scattered signals and detection of sea-surface targets
CN103760543B (en) Multimodal CFAR target detection method
CN104457626B (en) Plant leaf area index estimation method based on lidar point cloud
Stumpf et al. The National Severe Storms Laboratory mesocyclone detection algorithm for the WSR-88D
CN110261857B (en) Spatial interpolation method for weather radar
KR20150066315A (en) Quantitative precipitation estimation system based dual polarization radars and method thereof
CN109324328B (en) Method and device for extracting wind profile radar vertical beam turbulence spectrum in rainfall
Holleman et al. Quality assessment of weather radar wind profiles during bird migration
Yin et al. Object-orientated filter design in spectral domain for polarimetric weather radar
Zou et al. A method of radar echo extrapolation based on TREC and Barnes filter
Peter et al. Application of a Bayesian classifier of anomalous propagation to single-polarization radar reflectivity data
Sinha et al. Estimation of Doppler profile using multiparameter cost function method
Williams et al. Cluster analysis techniques to separate air motion and hydrometeors in vertical incident profiler observations
CN113096122A (en) Meteor detection method and device and electronic equipment
CN108985292A (en) SAR image CFAR target detection method and system based on multi-scale segmentation
Kida et al. Improvement of rain/no-rain classification methods for microwave radiometer observations over the ocean using a 37 GHz emission signature
CN116879899A (en) Method based on aerial precipitation particle spectrum inversion
Yang et al. Automatic identification of clear-air echoes based on millimeter-wave cloud radar measurements
Chen et al. Jensen–Shannon Distance-Based Filter and Unsupervised Evaluation Metrics for Polarimetric Weather Radar Processing
Kingfield et al. An evaluation of tornado intensity using velocity and strength attributes from the WSR-88D mesocyclone detection algorithm
Zhao et al. Identification and Removal of Ground Clutter Using the Fuzzy Logic Algorithm
Helmert et al. An operational tool to quality control 2D radar reflectivity data for assimilation in COSMO-DE

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination