CN114114261A - C2C-based self-supervised SAR sea clutter suppression method, system, storage medium and device


Info

Publication number
CN114114261A
CN114114261A
Authority
CN
China
Prior art keywords
clutter
sar
unet
slices
clutter suppression
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111417460.0A
Other languages
Chinese (zh)
Other versions
CN114114261B (en)
Inventor
Zhang Yun (张云)
Hua Qinglong (化青龙)
Wang Jun (王军)
Ji Zhenyuan (冀振元)
Jiang Yicheng (姜义成)
Xu Dan (徐丹)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology
Priority to CN202111417460.0A
Publication of CN114114261A
Application granted
Publication of CN114114261B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/89: Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90: Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9021: SAR image post-processing techniques
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414: Discriminating targets with respect to background clutter

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A C2C-based self-supervised SAR sea clutter suppression method, system, storage medium and device, belonging to the technical field of radar image processing. The method addresses the problem that existing deep learning techniques cannot effectively suppress SAR clutter because clutter-free real data are unavailable in practice. First, the wide-scene SAR image to be suppressed is cropped to form a test slice set. For each test slice S, N clutter slices are randomly selected from the clutter slice set used for CV-UNet++ training, and, following the C2C strategy, the N clutter slices are subtracted from the test slice S to obtain $\hat{S}_i = S - C_i$, $i = 1, \dots, N$. The slices $(\hat{S}_1, \dots, \hat{S}_N)$ are input into CV-UNet++ to obtain N clutter-suppressed slices $T_i$, and the average $\bar{T} = \frac{1}{N}\sum_{i=1}^{N} T_i$ is taken as the final clutter suppression result for the test slice S. The method is mainly used for suppressing SAR sea clutter.

Description

C2C-based self-supervised SAR sea clutter suppression method, system, storage medium and device
Technical Field
The invention relates to an SAR sea clutter suppression method, and belongs to the technical field of radar image processing.
Background
Synthetic aperture radar (SAR) offers irreplaceable advantages in ocean monitoring, and SAR technology has broad application prospects for detecting sea-surface targets, especially ship targets. However, sea clutter caused by strong reflections from the ocean surface is a typical problem for SAR image interpretation and target detection. It degrades SAR image quality and impairs the performance of SAR remote sensing applications such as ship target detection and classification. Several methods have been proposed to suppress sea clutter in SAR images, including multi-look processing, smoothing, and adaptive spatial-domain filtering. However, manually selecting appropriate parameters for these conventional algorithms is not easy, and it is difficult to balance preserving image features against removing clutter.
To address the limitations of conventional approaches, several deep learning based methods have been developed. Most of them adopt a convolutional neural network architecture, cast SAR image clutter suppression as a regression task, and output a clean clutter-free image end to end. Such algorithms require supervised training on paired clutter-free and cluttered images, and their biggest problem is the lack of clutter-free real data.
Disclosure of Invention
The invention aims to solve the problem that existing deep learning techniques cannot effectively suppress SAR clutter because clutter-free real data are unavailable in practice.
The C2C-based self-supervised SAR sea clutter suppression method comprises the following steps:
first, cropping the wide-scene SAR image to be suppressed to form a test slice set;
then processing each test slice S: randomly selecting N clutter slices $(C_1, C_2, \dots, C_N)$ from the clutter slice set used for CV-UNet++ training, and, following the C2C strategy, subtracting each of the N clutter slices from the test slice S to obtain
$\hat{S}_i = S - C_i, \quad i = 1, \dots, N;$
inputting $(\hat{S}_1, \hat{S}_2, \dots, \hat{S}_N)$ into the trained CV-UNet++ to obtain N clutter-suppressed slices $(T_1, T_2, \dots, T_N)$, and taking the average
$\bar{T} = \frac{1}{N}\sum_{i=1}^{N} T_i$
as the final clutter suppression result for the test slice S;
the training process of CV-UNet++ comprises the following steps:
s1, establishing a sample library:
dividing the SAR image used for training into two areas, where the area containing the targets of interest is the target area and the area containing only background clutter is the clutter area;
randomly selecting a pixel in the target area of the SAR image as a center point and cropping a W × H target slice around it, forming a target slice set;
randomly cropping clutter slices of the same size from the clutter area to form a clutter slice set, where the clutter slices must not overlap the target area;
S2, C2C strategy:
randomly selecting a target slice S from the target slice set, then randomly selecting two clutter slices $C_1$ and $C_2$ from the clutter slice set, and constructing the complex-domain SAR image pair
$(\hat{S}_1, \hat{S}_2) = (S - C_1,\; S - C_2);$
S3, taking CV-UNet++ as the clutter suppression model $\phi_\theta$, training $\phi_\theta$ with the complex-domain SAR image pairs, and obtaining the trained CV-UNet++ network.
Further, when cropping the wide-scene SAR image to be suppressed, it is cropped into tiles of size W × H = 128 × 128.
Furthermore, when the wide-scene SAR image to be suppressed is cropped, the test slices do not overlap.
Further, when training the clutter suppression model $\phi_\theta$, a mean square error loss function is used to optimize $\phi_\theta$, as shown below:
$\theta^{*} = \arg\min_{\theta}\; \mathbb{E}_{S, C_1, C_2} \left\| \phi_\theta(S - C_1) - (S - C_2) \right\|_2^2$
where $\theta$ denotes the parameters of $\phi_\theta$.
The C2C-based self-supervised SAR sea clutter suppression system is configured to execute the above C2C-based self-supervised SAR sea clutter suppression method.
A storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the C2C-based self-supervised SAR sea clutter suppression method.
A device comprising a processor and a memory, the memory having stored therein at least one instruction that is loaded and executed by the processor to implement the C2C-based self-supervised SAR sea clutter suppression method.
The invention has the following beneficial effects:
(a) The invention innovatively proposes an SAR sea clutter suppression framework based on the self-supervised training strategy C2C. The framework requires neither clean clutter-free SAR images nor repeated-observation SAR images of the same scene, yet achieves a good clutter suppression effect, overcoming the drawback that supervised learning requires clean clutter-free SAR images.
(b) The invention innovatively proposes a complex-domain deep learning network, CV-UNet++, and applies it to sea clutter suppression. It makes full use of the amplitude and phase information of the SAR image and achieves better clutter suppression performance than the real-domain UNet++ network.
(c) By using deep learning, the invention significantly improves SAR sea clutter suppression, better preserves the complete features of targets of interest, and benefits subsequent SAR image processing.
Drawings
Fig. 1 is a flow chart of SAR sea clutter suppression based on the self-supervised training strategy C2C;
Fig. 2 is a diagram of the CV-UNet++ architecture;
Fig. 3 is a schematic diagram of the training-area and test-area selection, where R0 is the training area and R1-R2 are the test areas;
Fig. 4 shows the clutter suppression results, where (a1) is the original image of test region R1 and (b1)-(c1) are the clutter suppression results of UNet++ and CV-UNet++ on R1; (a2) is the original image of test region R2 and (b2)-(c2) are the clutter suppression results of UNet++ and CV-UNet++ on R2.
Detailed Description
Embodiment 1: this embodiment is described with reference to Fig. 1.
To overcome the limitations of conventional methods and existing deep learning methods, the invention provides an SAR sea clutter suppression method based on a new self-supervised training strategy, C2C, which requires neither labels containing only the target without any clutter nor repeated-observation SAR images of the same scene. Meanwhile, to make full use of the amplitude and phase information of the SAR image, the method uses the C2C strategy to train a CV-UNet++ network model for sea clutter suppression, which achieves a good clutter suppression effect and preserves the complete information of targets of interest.
The C2C-based self-supervised SAR sea clutter suppression method described in this embodiment comprises the following steps:
s1, establishing a sample library
First, the SAR image is divided into two areas: the area containing the targets of interest is the target area, and the area containing only background clutter is the clutter area.
A pixel in the target area of the SAR image is randomly selected as a center point, and a 128 × 128 target slice is cropped around it; these slices form the target slice set.
Clutter slices of the same size are randomly cropped from the clutter area to form the clutter slice set; the clutter slices must not overlap the target area.
It should be noted that the target slice includes both target and background clutter, while the clutter slice includes only background clutter.
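As an illustration, the following is a minimal sketch of the sample library construction in Python (NumPy). All names and the region-specification format are hypothetical conveniences, not taken from the patent; the SAR image is assumed to be a complex-valued 2-D array with the target and clutter regions given as pixel-coordinate boxes.

    import numpy as np

    def build_sample_library(img, target_boxes, clutter_box, n_slices=500, size=128, seed=0):
        """Crop random 128 x 128 target slices (centered on pixels inside the
        target regions) and clutter slices (entirely inside the clutter region)
        from a complex-valued SAR image."""
        rng = np.random.default_rng(seed)
        half = size // 2
        target_set = []
        for _ in range(n_slices):
            y0, x0, y1, x1 = target_boxes[rng.integers(len(target_boxes))]
            # random center pixel inside a target region, kept away from image borders
            cy = rng.integers(max(y0, half), min(y1, img.shape[0] - half))
            cx = rng.integers(max(x0, half), min(x1, img.shape[1] - half))
            target_set.append(img[cy - half:cy + half, cx - half:cx + half])
        y0, x0, y1, x1 = clutter_box  # the clutter box must not overlap any target region
        clutter_set = []
        for _ in range(n_slices):
            ty = rng.integers(y0, y1 - size)
            tx = rng.integers(x0, x1 - size)
            clutter_set.append(img[ty:ty + size, tx:tx + size])
        return np.stack(target_set), np.stack(clutter_set)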
S2, C2C strategy
A target slice S is randomly selected from the target slice set, two clutter slices $C_1$ and $C_2$ are then randomly selected from the clutter slice set, and the complex-domain SAR image pair
$(\hat{S}_1, \hat{S}_2) = (S - C_1,\; S - C_2)$
is constructed.
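For concreteness, a sketch of the pair construction under the subtraction form given above (the pair form $(S - C_1, S - C_2)$ is inferred from the test-time procedure described later; function and variable names are illustrative):

    import numpy as np

    def make_c2c_pair(target_set, clutter_set, rng):
        """Draw one target slice S and two distinct clutter slices C1, C2,
        and return the complex-domain training pair (S - C1, S - C2)."""
        S = target_set[rng.integers(len(target_set))]
        i, k = rng.choice(len(clutter_set), size=2, replace=False)
        return S - clutter_set[i], S - clutter_set[k]  # (network input, training label)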
S3, to make full use of the amplitude and phase information of the SAR image, CV-UNet++ is designed as the clutter suppression model $\phi_\theta$. The input of CV-UNet++ is the complex-domain slice $\hat{S}_1 = S - C_1$ and the output is the clutter-suppressed slice.
The CV-UNet++ network is shown in Fig. 2. CV-UNet++ extends UNet++ to the complex domain: its convolutional layers and activation functions operate on complex values.
The clutter suppression network $\phi_\theta$ is trained without any labels that contain only the target and no clutter; the labels may contain both clutter and target. During training, the mean square error (MSE) loss function is used to optimize $\phi_\theta$, as shown below:
$\theta^{*} = \arg\min_{\theta}\; \mathbb{E}_{S, C_1, C_2} \left\| \phi_\theta(S - C_1) - (S - C_2) \right\|_2^2$
where $\theta$ denotes the parameters of $\phi_\theta$.
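A minimal training-step sketch of this loss in PyTorch follows; complex64 tensors are used directly, the model stands in for the CV-UNet++ of Fig. 2, and all names and hyperparameters are illustrative assumptions rather than values fixed by the patent:

    import torch

    def train_step(model, optimizer, S, C1, C2):
        """One C2C optimization step: regress phi_theta(S - C1) onto S - C2.
        S, C1, C2: complex64 tensors of shape (batch, 1, 128, 128)."""
        optimizer.zero_grad()
        pred = model(S - C1)                                # clutter-suppressed estimate
        loss = torch.mean(torch.abs(pred - (S - C2)) ** 2)  # MSE over real and imaginary parts
        loss.backward()
        optimizer.step()
        return loss.item()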
The forward propagation of a CV-UNet++ convolutional layer is as follows: the output $a^{l-1}$ of the previous layer is convolved with the weight matrix $W^l$ of the $l$-th convolutional layer and the bias $b^l$ is added to obtain $z^l$; $z^l$ is then passed through the complex-domain activation function $\sigma_{\mathbb{C}}(\cdot)$ to obtain the output $a^l$ of the convolutional layer. The calculation can be expressed as
$a^l = \sigma_{\mathbb{C}}(z^l) = \sigma_{\mathbb{C}}(W^l * a^{l-1} + b^l)$
$z^l = \left[\Re(W^l) * \Re(a^{l-1}) - \Im(W^l) * \Im(a^{l-1}) + \Re(b^l)\right] + j\left[\Re(W^l) * \Im(a^{l-1}) + \Im(W^l) * \Re(a^{l-1}) + \Im(b^l)\right]$
where the complex-domain activation function
$\sigma_{\mathbb{C}}(z) = \sigma(\Re(z)) + j\,\sigma(\Im(z))$
activates the real part and the imaginary part of a complex value separately; $\sigma(\cdot)$ denotes the real-domain activation function, for which the invention uses the Leaky ReLU; the symbol $*$ denotes the convolution operation; $\Re(\cdot)$ and $\Im(\cdot)$ denote the real and imaginary parts of a complex quantity, so that $\Re(W^l)$ and $\Im(W^l)$ are the real and imaginary parts of the weight matrix $W^l$, $\Re(a^{l-1})$ and $\Im(a^{l-1})$ are the real and imaginary parts of the output $a^{l-1}$ of layer $l-1$, and $\Re(b^l)$ and $\Im(b^l)$ are the real and imaginary parts of the bias $b^l$ of the $l$-th layer. When $l = 1$, $a^0 = \hat{S}_1$; that is, the input of the first convolutional layer is the complex-domain slice $\hat{S}_1 = S - C_1$, where $j$ denotes the imaginary unit.
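A sketch of this complex-domain convolution and split activation, realized with two real-valued convolutions (PyTorch; the bias pair is carried inside the two real convolutions, which is an equivalent parameterization of the complex bias):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ComplexConv2d(nn.Module):
        """Complex convolution: (Wr + j Wi) * (ar + j ai)
        = (Wr*ar - Wi*ai) + j (Wr*ai + Wi*ar)."""
        def __init__(self, in_ch, out_ch, kernel=3, pad=1):
            super().__init__()
            self.conv_r = nn.Conv2d(in_ch, out_ch, kernel, padding=pad)  # real-part weights
            self.conv_i = nn.Conv2d(in_ch, out_ch, kernel, padding=pad)  # imaginary-part weights

        def forward(self, x):  # x: complex tensor of shape (batch, ch, H, W)
            ar, ai = x.real, x.imag
            zr = self.conv_r(ar) - self.conv_i(ai)
            zi = self.conv_r(ai) + self.conv_i(ar)
            return torch.complex(zr, zi)

    def complex_leaky_relu(z, slope=0.2):
        """Split activation: Leaky ReLU applied separately to real and imaginary parts."""
        return torch.complex(F.leaky_relu(z.real, slope), F.leaky_relu(z.imag, slope))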
The back propagation of a CV-UNet++ convolutional layer is as follows: if the error term of the $l$-th convolutional layer is $\delta^l$, the error term of layer $l-1$ can be derived as
$\delta^{l-1} = \left(\delta^l * \mathrm{rot180}(W^l)\right) \odot \sigma'_{\mathbb{C}}(z^{l-1})$
where $\sigma'(\cdot)$ denotes the derivative of the real-domain activation function $\sigma(\cdot)$; the symbol $\odot$ denotes the Hadamard product; and $\mathrm{rot180}(\cdot)$ denotes rotating a matrix by 180 degrees.
The gradients of the $l$-th convolutional layer weights are
$\dfrac{\partial L}{\partial \Re(W^l)} = \Re(\delta^l) * \mathrm{rot180}(\Re(a^{l-1})) + \Im(\delta^l) * \mathrm{rot180}(\Im(a^{l-1}))$
$\dfrac{\partial L}{\partial \Im(W^l)} = \Im(\delta^l) * \mathrm{rot180}(\Re(a^{l-1})) - \Re(\delta^l) * \mathrm{rot180}(\Im(a^{l-1}))$
Then the weight update formulas of the convolutional layer are:
$\Re(W^l) \leftarrow \Re(W^l) - \eta\,\dfrac{\partial L}{\partial \Re(W^l)}, \qquad \Im(W^l) \leftarrow \Im(W^l) - \eta\,\dfrac{\partial L}{\partial \Im(W^l)}$
where $\eta$ is the learning rate.
S4, sea clutter suppression for wide-scene SAR
First, the wide-scene SAR image is cropped into tiles of size W × H = 128 × 128 to form a test slice set.
Note that the cropping of the wide-scene SAR image here is not sliding-window cropping; there is no overlap between the test slices.
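A sketch of this non-overlapping tiling and the corresponding stitching (NumPy; it assumes the scene size is a multiple of 128, since the patent does not specify how borders are handled):

    import numpy as np

    def tile_scene(scene, tile=128):
        """Cut a wide-scene complex SAR image into non-overlapping tile x tile slices."""
        H, W = scene.shape
        rows, cols = H // tile, W // tile  # assumes H and W are multiples of tile
        slices = [scene[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile]
                  for r in range(rows) for c in range(cols)]
        return np.stack(slices), (rows, cols)

    def stitch_scene(slices, grid, tile=128):
        """Reassemble the clutter-suppressed slices into the wide-scene output."""
        rows, cols = grid
        out = np.zeros((rows * tile, cols * tile), dtype=slices.dtype)
        for idx, s in enumerate(slices):
            r, c = divmod(idx, cols)
            out[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile] = s
        return out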
Each test slice S is processed as follows: N clutter slices $(C_1, C_2, \dots, C_N)$ are randomly selected from the clutter slice set used for CV-UNet++ training, and, following the C2C strategy, each of the N clutter slices is subtracted from the test slice S to obtain
$\hat{S}_i = S - C_i, \quad i = 1, \dots, N.$
$(\hat{S}_1, \hat{S}_2, \dots, \hat{S}_N)$ are input into the trained CV-UNet++ to obtain N clutter-suppressed slices $(T_1, T_2, \dots, T_N)$, and the average
$\bar{T} = \frac{1}{N}\sum_{i=1}^{N} T_i$
is taken as the final clutter suppression result for the test slice S.
This processing is applied to all slices in the test slice set, and the clutter-suppressed slices are stitched together into the final wide-scene SAR sea clutter suppression output.
When the clutter intensity is weak, N can be set to 1; when the sea clutter reflection is strong, N can be increased as appropriate.
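Putting the pieces together, a per-slice inference sketch (PyTorch; suppress_slice and clutter_bank are hypothetical names, and the averaging of the N network outputs follows the procedure described above):

    import torch

    @torch.no_grad()
    def suppress_slice(model, S, clutter_bank, N=4):
        """C2C inference for one test slice S (complex tensor of shape (1, 1, 128, 128)):
        subtract N randomly chosen clutter slices, run CV-UNet++ on each difference,
        and average the N outputs into the final clutter-suppressed slice."""
        idx = torch.randperm(len(clutter_bank))[:N]
        outs = [model(S - clutter_bank[i].reshape(1, 1, 128, 128)) for i in idx]  # T_i
        return torch.mean(torch.stack(outs), dim=0)  # mean of T_1 .. T_N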
The invention has the following beneficial effects:
(a) The invention innovatively proposes an SAR sea clutter suppression framework based on the self-supervised training strategy C2C. The framework requires neither clean clutter-free SAR images nor repeated-observation SAR images of the same scene, yet achieves a good clutter suppression effect, overcoming the drawback that supervised learning requires clean clutter-free SAR images.
(b) The invention innovatively proposes a complex-domain deep learning network, CV-UNet++, and applies it to sea clutter suppression. It makes full use of the amplitude and phase information of the SAR image and achieves better clutter suppression performance than the real-domain UNet++ network.
(c) By using deep learning, the invention significantly improves SAR sea clutter suppression, better preserves the complete features of targets of interest, and benefits subsequent SAR image processing.
Embodiment 2:
This embodiment is a C2C-based self-supervised SAR sea clutter suppression system, configured to execute the C2C-based self-supervised SAR sea clutter suppression method described above.
Embodiment 3:
This embodiment is a storage medium in which at least one instruction is stored; the instruction is loaded and executed by a processor to implement the C2C-based self-supervised SAR sea clutter suppression method.
Embodiment 4:
This embodiment is a device comprising a processor and a memory; the memory stores at least one instruction, which is loaded and executed by the processor to implement the C2C-based self-supervised SAR sea clutter suppression method.
Examples
The sea clutter suppression experiment is carried out according to the method of Embodiment 1 and comprises three parts: sample library establishment, the C2C strategy, and wide-scene SAR sea clutter suppression.
The specific steps of the sample library establishment part are as follows:
(1) selecting targets of interest, mainly ship targets, in the SAR image; the region where the targets of interest are located is the target region, and the region where the background clutter is located is the clutter region;
(2) randomly selecting a pixel in the target region of the SAR image as a center point and cropping a 128 × 128 target slice around it, forming a target slice set;
(3) randomly cropping clutter slices of the same size from the clutter region to form a clutter slice set, where the clutter slices must not overlap the target region.
The specific steps of the C2C-based self-supervised SAR sea clutter suppression are as follows:
1) taking the R0 region shown in Fig. 3 as the training region, and generating a target slice set and a clutter slice set following the sample library establishment steps above;
2) training the CV-UNet++ network according to the C2C strategy;
3) taking the R1 and R2 regions shown in Fig. 3 as the test regions, and cropping the wide-scene SAR image into tiles of size W × H = 128 × 128 to form a test slice set;
4) selecting a test slice S, randomly selecting N clutter slices $(C_1, C_2, \dots, C_N)$ from the clutter slice set, and, following the C2C strategy, subtracting each of the N clutter slices from the test slice S to obtain $\hat{S}_i = S - C_i$, $i = 1, \dots, N$;
5) inputting $(\hat{S}_1, \hat{S}_2, \dots, \hat{S}_N)$ into the trained CV-UNet++ to obtain N clutter-suppressed slices $(T_1, T_2, \dots, T_N)$, and taking the average $\bar{T} = \frac{1}{N}\sum_{i=1}^{N} T_i$ as the final clutter suppression result for the test slice S;
6) if all slices in the test slice set have been processed, going to step 7); otherwise, going to step 4);
7) stitching all processing results of the test slice set in order into the wide-scene SAR image and outputting it as the final clutter suppression result; the effect is shown in Fig. 4.
Clutter suppression was performed on test regions R1 and R2 using UNet++ and CV-UNet++, respectively; the results are shown in Fig. 4. The signal-to-clutter ratio (SCR) is used to compare the performance of the clutter suppression methods, and the SCR statistics after clutter suppression by each method are given in Table 1, which lists the SCR statistics (dB) of UNet++ and CV-UNet++. Qualitative analysis of Fig. 4 shows that both UNet++ and CV-UNet++ suppress the background clutter well, with CV-UNet++ performing better. Notably, CV-UNet++ suppresses clutter energy while preserving target energy. Compared with conventional methods, CV-UNet++ trained with the C2C strategy preserves the originality and integrity of the target information to the greatest extent, with almost no loss of the information contained in the target, even in its defocused parts. This fully demonstrates the advantages of the C2C strategy and the deep learning network for SAR sea clutter suppression. Quantitatively, Table 1 shows that the clutter-suppressed images of the deep learning methods have a higher SCR, with CV-UNet++ the highest. In summary, by both qualitative and quantitative analysis, CV-UNet++ trained with the C2C strategy has the better clutter suppression performance.
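For reference, one common way to compute such a signal-to-clutter ratio is sketched below (the patent does not state its exact SCR definition, so the mask-based power ratio used here is an assumption):

    import numpy as np

    def scr_db(img, target_mask, clutter_mask):
        """Signal-to-clutter ratio in dB: mean target power over mean clutter power.
        img: complex SAR slice; the masks are boolean arrays marking target and clutter pixels."""
        p_target = np.mean(np.abs(img[target_mask]) ** 2)
        p_clutter = np.mean(np.abs(img[clutter_mask]) ** 2)
        return 10.0 * np.log10(p_target / p_clutter)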
Table 1. Signal-to-clutter ratio statistics (dB) of UNet++ and CV-UNet++
The computational efficiency of the different methods was statistically analyzed on a computer configured with an Intel(R) Xeon(R) Silver 4110 CPU and an NVIDIA GeForce RTX 1080 graphics card. The processing time of each method on the test regions R1 and R2 was recorded, and the results are shown in Table 2, which lists the computational efficiency statistics (s) of UNet++ and CV-UNet++. As Table 2 shows, the processing times of the two deep learning methods, UNet++ and CV-UNet++, are broadly comparable to that of the conventional method. Owing to the complexity of complex-valued arithmetic, the processing time of CV-UNet++ is about twice that of UNet++.
Table 2. Computational efficiency statistics (s) of UNet++ and CV-UNet++
The present invention is capable of other embodiments and its several details are capable of modifications in various obvious respects, all without departing from the spirit and scope of the present invention.

Claims (10)

1. A C2C-based self-supervised SAR sea clutter suppression method, characterized by comprising the following steps:
first, cropping the wide-scene SAR image to be suppressed to form a test slice set;
then processing each test slice S: randomly selecting N clutter slices $(C_1, C_2, \dots, C_N)$ from the clutter slice set used for CV-UNet++ training, and, following the C2C strategy, subtracting each of the N clutter slices from the test slice S to obtain
$\hat{S}_i = S - C_i, \quad i = 1, \dots, N;$
inputting $(\hat{S}_1, \hat{S}_2, \dots, \hat{S}_N)$ into the trained CV-UNet++ to obtain N clutter-suppressed slices $(T_1, T_2, \dots, T_N)$, and taking the average
$\bar{T} = \frac{1}{N}\sum_{i=1}^{N} T_i$
as the final clutter suppression result for the test slice S;
the training process of CV-UNet++ comprises the following steps:
s1, establishing a sample library:
dividing the SAR image used for training into two areas, where the area containing the targets of interest is the target area and the area containing only background clutter is the clutter area;
randomly selecting a pixel in the target area of the SAR image as a center point and cropping a W × H target slice around it, forming a target slice set;
randomly cropping clutter slices of the same size from the clutter area to form a clutter slice set, where the clutter slices must not overlap the target area;
S2, C2C strategy:
randomly selecting a target slice S from the target slice set, then randomly selecting two clutter slices $C_1$ and $C_2$ from the clutter slice set, and constructing the complex-domain SAR image pair
$(\hat{S}_1, \hat{S}_2) = (S - C_1,\; S - C_2);$
S3, taking CV-UNet++ as the clutter suppression model $\phi_\theta$, training the clutter suppression model $\phi_\theta$ with the complex-domain SAR image pairs, and obtaining the trained CV-UNet++ network.
2. The C2C-based self-supervised SAR sea clutter suppression method according to claim 1, wherein, in cropping the wide-scene SAR image to be suppressed, it is cropped into tiles of size W × H = 128 × 128.
3. The C2C-based self-supervised SAR sea clutter suppression method according to claim 2, wherein, when the wide-scene SAR image to be suppressed is cropped, the test slices do not overlap.
4. The C2C-based self-supervised SAR sea clutter suppression method according to claim 1, 2 or 3, wherein, when training the clutter suppression model $\phi_\theta$, a mean square error loss function is used to optimize $\phi_\theta$, as shown below:
$\theta^{*} = \arg\min_{\theta}\; \mathbb{E}_{S, C_1, C_2} \left\| \phi_\theta(S - C_1) - (S - C_2) \right\|_2^2$
where $\theta$ denotes the parameters of $\phi_\theta$.
5. The C2C-based self-supervised SAR sea clutter suppression method according to claim 4, wherein the forward propagation of a CV-UNet++ convolutional layer is as follows: the output $a^{l-1}$ of the previous layer is convolved with the weight matrix $W^l$ of the $l$-th convolutional layer and the bias $b^l$ is added to obtain $z^l$; $z^l$ is then passed through the complex-domain activation function $\sigma_{\mathbb{C}}(\cdot)$ to obtain the output $a^l$ of the convolutional layer, the calculation being expressible as
$a^l = \sigma_{\mathbb{C}}(z^l) = \sigma_{\mathbb{C}}(W^l * a^{l-1} + b^l)$
$z^l = \left[\Re(W^l) * \Re(a^{l-1}) - \Im(W^l) * \Im(a^{l-1}) + \Re(b^l)\right] + j\left[\Re(W^l) * \Im(a^{l-1}) + \Im(W^l) * \Re(a^{l-1}) + \Im(b^l)\right]$
wherein the complex-domain activation function
$\sigma_{\mathbb{C}}(z) = \sigma(\Re(z)) + j\,\sigma(\Im(z))$
activates the real part and the imaginary part of a complex value separately; $\sigma(\cdot)$ denotes a real-domain activation function, and the symbol $*$ denotes the convolution operation; $\Re(\cdot)$ and $\Im(\cdot)$ denote the real and imaginary parts of a complex quantity: $\Re(W^l)$ and $\Im(W^l)$ are the real and imaginary parts of the weight matrix $W^l$ of the $l$-th convolutional layer, $\Re(a^{l-1})$ and $\Im(a^{l-1})$ are the real and imaginary parts of the output $a^{l-1}$ of layer $l-1$, and $\Re(b^l)$ and $\Im(b^l)$ are the real and imaginary parts of the bias $b^l$ of the $l$-th layer; when $l = 1$, $a^0 = \hat{S}_1$, that is, the input of the first convolutional layer is the complex-domain slice $\hat{S}_1 = S - C_1$; $j$ denotes the imaginary unit.
6. The C2C-based self-supervised SAR sea clutter suppression method according to claim 5, wherein the real-domain activation function is a Leaky ReLU.
7. The C2C-based self-supervised SAR sea clutter suppression method according to claim 6, wherein, in the CV-UNet++ training process, the back propagation of a convolutional layer is as follows: if the error term of the $l$-th convolutional layer is $\delta^l$, the error term of layer $l-1$ is
$\delta^{l-1} = \left(\delta^l * \mathrm{rot180}(W^l)\right) \odot \sigma'_{\mathbb{C}}(z^{l-1})$
where $\sigma'(\cdot)$ denotes the derivative of the real-domain activation function $\sigma(\cdot)$, the symbol $\odot$ denotes the Hadamard product, and $\mathrm{rot180}(\cdot)$ denotes rotating a matrix by 180 degrees;
the gradients of the $l$-th convolutional layer weights are
$\dfrac{\partial L}{\partial \Re(W^l)} = \Re(\delta^l) * \mathrm{rot180}(\Re(a^{l-1})) + \Im(\delta^l) * \mathrm{rot180}(\Im(a^{l-1}))$
$\dfrac{\partial L}{\partial \Im(W^l)} = \Im(\delta^l) * \mathrm{rot180}(\Re(a^{l-1})) - \Re(\delta^l) * \mathrm{rot180}(\Im(a^{l-1}))$
and the weight update formulas of the convolutional layer are:
$\Re(W^l) \leftarrow \Re(W^l) - \eta\,\dfrac{\partial L}{\partial \Re(W^l)}, \qquad \Im(W^l) \leftarrow \Im(W^l) - \eta\,\dfrac{\partial L}{\partial \Im(W^l)}$
where $\eta$ is the learning rate.
8. A C2C-based self-supervised SAR sea clutter suppression system, characterized in that the system is configured to perform the C2C-based self-supervised SAR sea clutter suppression method according to any one of claims 1 to 7.
9. A storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the C2C-based self-supervised SAR sea clutter suppression method according to any one of claims 1 to 7.
10. A device comprising a processor and a memory, the memory having stored therein at least one instruction that is loaded and executed by the processor to implement the C2C-based self-supervised SAR sea clutter suppression method according to any one of claims 1 to 7.
CN202111417460.0A 2021-11-25 2021-11-25 C2C-based self-supervised SAR sea clutter suppression method, system, storage medium and device Active CN114114261B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111417460.0A 2021-11-25 2021-11-25 C2C-based self-supervised SAR sea clutter suppression method, system, storage medium and device (granted as CN114114261B)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111417460.0A 2021-11-25 2021-11-25 C2C-based self-supervised SAR sea clutter suppression method, system, storage medium and device (granted as CN114114261B)

Publications (2)

Publication Number Publication Date
CN114114261A 2022-03-01
CN114114261B CN114114261B (en) 2024-07-02

Family

ID=80373746

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111417460.0A C2C-based self-supervised SAR sea clutter suppression method, system, storage medium and device (granted as CN114114261B, active) 2021-11-25 2021-11-25

Country Status (1)

Country Link
CN (1) CN114114261B (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109740549A (en) * 2019-01-08 2019-05-10 Xidian University SAR image object detection system and method based on semi-supervised CNN
CN110824473A (en) * 2019-10-21 2020-02-21 Northwestern Polytechnical University Subspace-based high-resolution wide-swath SAR-GMTI clutter suppression method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
S. Guo et al., "Sea clutter and target detection with deep neural networks," Proc. 2nd Int. Conf. Artif. Intell. Eng. Appl. (AIEA), 31 December 2017.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116228609A (en) * 2023-05-10 2023-06-06 National University of Defense Technology Radar image speckle filtering method and device based on zero-shot learning
CN116228609B (en) * 2023-05-10 2023-07-21 National University of Defense Technology Radar image speckle filtering method and device based on zero-shot learning

Also Published As

Publication number Publication date
CN114114261B (en) 2024-07-02

Similar Documents

Publication Publication Date Title
CN110927706B (en) Convolutional neural network-based radar interference detection and identification method
CN110148088B (en) Image processing method, image rain removing method, device, terminal and medium
CN112184693A (en) Intelligent detection method for weld defects of ray industrial negative
CN103824088A (en) SAR target variant recognition method based on multi-information joint dynamic sparse representation
CN113837974A (en) NSST (non-subsampled contourlet transform) domain power equipment infrared image enhancement method based on improved BEEPS (Bayesian particle swarm optimization) filtering algorithm
CN113673312B (en) Deep learning-based radar signal intra-pulse modulation identification method
CN109712128A (en) Feature point detecting method, device, computer equipment and storage medium
CN114781514A (en) Floater target detection method and system integrating attention mechanism
CN114114261A (en) C2C-based self-supervision SAR sea clutter suppression method, system, storage medium and equipment
CN115546622A (en) Fish shoal detection method and system, electronic device and storage medium
Li et al. Single underwater image enhancement using integrated variational model
CN112731410A (en) Underwater target sonar detection method based on CNN
CN116091492B (en) Image change pixel level detection method and system
CN109285148B (en) Infrared weak and small target detection method based on heavily weighted low rank and enhanced sparsity
CN111539434A (en) Infrared weak and small target detection method based on similarity
Xiao et al. Underwater image classification based on image enhancement and information quality evaluation
CN113807206B (en) SAR image target identification method based on denoising task assistance
CN115578624A (en) Agricultural disease and pest model construction method, detection method and device
CN115497492A (en) Real-time voice enhancement method based on full convolution neural network
CN114862803A (en) Industrial image anomaly detection method based on fast Fourier convolution
CN114936570A (en) Interference signal intelligent identification method based on lightweight CNN network
Chen et al. Ultrasound image denoising with multi-shape patches aggregation based non-local means
Yuan et al. Segmentation-guided semantic-aware self-supervised denoising for SAR image
Joshi et al. Ship Detection from Despeckled Satellite Images using Deep Learning
CN110415190A (en) Method, apparatus and processor based on deep learning removal compression of images noise

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant