CN114114261B - C2C-based self-supervision SAR sea clutter suppression method, system, storage medium and device - Google Patents


Info

Publication number
CN114114261B
CN114114261B (application CN202111417460.0A)
Authority
CN
China
Prior art keywords
clutter
sar
clutter suppression
unet
slices
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111417460.0A
Other languages
Chinese (zh)
Other versions
CN114114261A (en)
Inventor
张云
化青龙
王军
冀振元
姜义成
徐丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN202111417460.0A priority Critical patent/CN114114261B/en
Publication of CN114114261A publication Critical patent/CN114114261A/en
Application granted granted Critical
Publication of CN114114261B publication Critical patent/CN114114261B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9021SAR image post-processing techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414Discriminating targets with respect to background clutter

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A C2C-based self-supervised SAR sea clutter suppression method, system, storage medium and device, belonging to the technical field of radar image processing. The invention addresses the problem that, owing to the lack of clutter-free real data, existing deep learning techniques cannot be used effectively to suppress SAR clutter. First, the wide-scene SAR image to be suppressed is cropped to form a test slice set. For each test slice S, N clutter slices (C_1, C_2, ..., C_N) are randomly selected from the clutter slice set used for CV-UNet++ training; following the C2C strategy, the clutter slices are subtracted from the test slice S to obtain S − C_i (i = 1, ..., N), which are input into CV-UNet++ to obtain N clutter-suppressed slices T_i. The average T = (1/N) Σ T_i is taken as the final clutter suppression result for the test slice S. The method is mainly used for SAR sea clutter suppression.

Description

C2C-based self-supervision SAR sea clutter suppression method, system, storage medium and device
Technical Field
The invention relates to a SAR sea clutter suppression method, and belongs to the technical field of radar image processing.
Background
Synthetic aperture radar (SAR) technology has distinct advantages in the field of ocean monitoring, and the detection of sea-surface targets, especially ship targets, with synthetic aperture radar has broad application prospects. However, sea clutter caused by strong reflections from the ocean surface is a typical problem for SAR image interpretation and the detection of targets of interest. It degrades SAR image quality and affects the performance of various SAR remote-sensing applications, such as ship target detection and classification. Several methods have been proposed to suppress sea clutter in SAR images, including multilook processing, smoothing, and adaptive spatial-domain filters. However, manually selecting appropriate parameters for these conventional algorithms is not easy, and it is difficult to balance preserving image features against removing clutter.
To overcome the limitations of the conventional methods, some deep-learning-based methods have been developed. Most of them adopt a convolutional-neural-network architecture, cast SAR image clutter suppression as a regression task, and output clean clutter-free images in an end-to-end manner. Such algorithms require supervised training on pairs of clutter-contaminated and clutter-free images, and their biggest problem is the lack of clutter-free real data.
Disclosure of Invention
The invention aims to solve the problem that, owing to the lack of clutter-free real data, existing deep learning techniques cannot be used effectively to suppress SAR clutter.
The self-supervision SAR sea clutter suppression method based on the C2C comprises the following steps:
First, cropping the wide-scene SAR image to be suppressed to form a test slice set;
Then processing each test slice S: N clutter slices (C_1, C_2, ..., C_N) are randomly selected from the clutter slice set used for CV-UNet++ training, and, following the C2C strategy, the clutter slices are subtracted from the test slice S to obtain S − C_i (i = 1, ..., N);
The slices S − C_i are input into the trained CV-UNet++ to obtain N clutter-suppressed slices (T_1, T_2, ..., T_N), and the average T = (1/N) Σ_{i=1}^{N} T_i is taken as the final clutter suppression result for the test slice S;
The training process of CV-UNet++ comprises the following steps:
S1, establishing a sample library:
Dividing the SAR image used for training into two areas: the area where the targets of interest are located is the target area, and the area where the background clutter is located is the clutter area;
Randomly selecting pixel points in the target area as centre points and cropping W × H target slices around them to form a target slice set;
Randomly cropping clutter slices of the same size in the clutter area to form a clutter slice set, where the clutter slices must not overlap the target area;
s2, C2C strategy:
Randomly selecting a target slice S from the target slice set and two clutter slices C_1 and C_2 from the clutter slice set, and constructing the complex-domain SAR image pair (S − C_1, S − C_2);
S3, taking CV-UNet++ as the clutter suppression model Φ_θ, and training Φ_θ with the complex-domain SAR image pairs to obtain the trained CV-UNet++ network.
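The C2C training step can be sketched in code. This is an illustrative sketch only, not the patented implementation: it assumes the complex-domain pair is formed by subtracting two independently drawn clutter slices from the target slice, and `make_c2c_pair` and `mse_loss` are hypothetical helper names.

```python
import numpy as np

def make_c2c_pair(target_slice, clutter_set, rng):
    """Build one C2C training pair from a target slice S (target plus
    clutter) and two independently drawn clutter slices C1, C2; all
    arrays are complex-valued. The (S - C_i) convention is an
    assumption made for illustration."""
    i, k = rng.choice(len(clutter_set), size=2, replace=False)
    c1, c2 = clutter_set[i], clutter_set[k]
    net_input = target_slice - c1   # fed to the clutter-suppression network
    net_label = target_slice - c2   # regression target
    return net_input, net_label

def mse_loss(pred, label):
    """Complex mean-square-error loss between network output and label."""
    diff = pred - label
    return float(np.mean(np.abs(diff) ** 2))
```

Because C_1 and C_2 are independent draws from the same clutter distribution, the label is itself still cluttered; as in Noise2Noise-style training, the network can nonetheless learn to regress toward the clutter-free content.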
Further, in cropping the wide-scene SAR image to be suppressed, the image is cropped into slices of size W × H = 128 × 128.
Further, when the wide-scene SAR image to be suppressed is cropped, there is no overlap between the test slices.
Further, in training the clutter suppression model Φ_θ, a mean-square-error loss function is used to optimise Φ_θ, as follows:
L(θ) = ‖Φ_θ(S − C_1) − (S − C_2)‖²
where θ denotes the parameters of Φ_θ.
The self-supervision SAR sea clutter suppression system based on the C2C is used for executing the self-supervision SAR sea clutter suppression method based on the C2C.
A storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the C2C-based self-supervised SAR sea clutter suppression method.
An apparatus comprising a processor and a memory having stored therein at least one instruction that is loaded and executed by the processor to implement the C2C-based self-supervised SAR sea clutter suppression method.
The invention has the beneficial effects that:
(a) The invention proposes a SAR sea clutter suppression framework based on the self-supervised training strategy C2C. The framework achieves a good clutter suppression effect without requiring clean clutter-free SAR images or repeated-observation SAR images of the same scene, overcoming the drawback that supervised learning requires clean clutter-free SAR images.
(b) The invention proposes a complex-domain deep learning network, CV-UNet++, and applies it to sea clutter suppression, making full use of the amplitude and phase information of SAR images; compared with the real-domain UNet++ network, CV-UNet++ achieves better clutter suppression performance.
(c) By exploiting deep learning, the invention significantly improves the SAR sea clutter suppression effect and better preserves the complete characteristics of the targets of interest, which benefits subsequent SAR image processing.
Drawings
FIG. 1 is a SAR sea clutter suppression flow chart based on a self-supervised training strategy C2C;
FIG. 2 is a CV-UNet++ architecture diagram;
FIG. 3 is a schematic diagram of training-area and test-area selection, where R0 is the training area and R1 and R2 are the test areas;
Fig. 4 shows the clutter suppression results, where (a1) is the original image of test region R1, (b1)-(c1) are the clutter suppression results of UNet++ and CV-UNet++ on R1, (a2) is the original image of test region R2, and (b2)-(c2) are the clutter suppression results of UNet++ and CV-UNet++ on R2.
Detailed Description
The first embodiment is as follows: the present embodiment is described with reference to FIG. 1.
To overcome the limitations of conventional methods and existing deep learning methods, the invention proposes SAR sea clutter suppression based on a new self-supervised training strategy, C2C, which requires neither labels that contain only the target without any clutter nor repeated-observation SAR images of the same scene. Meanwhile, to make full use of the amplitude and phase information of SAR images, the invention trains a CV-UNet++ network model with the C2C strategy for sea clutter suppression; it achieves a good clutter suppression effect and well preserves the complete information of the targets of interest.
The self-supervision SAR sea clutter suppression method based on the C2C in the embodiment comprises the following steps:
S1, establishing a sample library
First, the SAR image is divided into two areas: the area where the targets of interest are located is the target area, and the area where the background clutter is located is the clutter area.
Pixel points are randomly selected in the target area as centre points, and target slices of size 128 × 128 are cropped around them to form a target slice set.
Clutter slices of the same size are randomly cropped in the clutter area to form a clutter slice set; the clutter slices must not overlap the target area.
It should be noted that the target slice contains both target and background clutter, while the clutter slice contains only background clutter.
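The sample-library construction of S1 can be sketched as follows. This is a minimal sketch with assumed conventions, not the patented implementation: target annotations are represented as hypothetical (row0, col0, row1, col1) boxes, and `crop_slices` is an illustrative helper name.

```python
import numpy as np

def crop_slices(sar_image, target_boxes, num_target, num_clutter,
                size=128, rng=None):
    """Cut size x size target slices (centred on random pixels inside
    the annotated target boxes) and clutter slices (random positions
    that do not overlap any target box) from a complex SAR image."""
    if rng is None:
        rng = np.random.default_rng()
    h, w = sar_image.shape
    half = size // 2

    def overlaps_target(r, c):
        # a tile [r, r+size) x [c, c+size) overlaps a box unless it is
        # entirely above/below/left/right of it
        return any(not (r + size <= r0 or r >= r1 or
                        c + size <= c0 or c >= c1)
                   for r0, c0, r1, c1 in target_boxes)

    target_set = []
    for _ in range(num_target):
        r0, c0, r1, c1 = target_boxes[rng.integers(len(target_boxes))]
        # random centre pixel inside the target box, clipped so the
        # slice stays within the image
        rc = int(np.clip(rng.integers(r0, r1), half, h - half))
        cc = int(np.clip(rng.integers(c0, c1), half, w - half))
        target_set.append(sar_image[rc - half:rc + half, cc - half:cc + half])

    clutter_set = []
    while len(clutter_set) < num_clutter:
        r = int(rng.integers(0, h - size))
        c = int(rng.integers(0, w - size))
        if not overlaps_target(r, c):     # clutter slices must avoid targets
            clutter_set.append(sar_image[r:r + size, c:c + size])
    return target_set, clutter_set
```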
S2, C2C strategy
A target slice S is randomly selected from the target slice set, two clutter slices C_1 and C_2 are randomly selected from the clutter slice set, and the complex-domain SAR image pair (S − C_1, S − C_2) is constructed.
S3, CV-UNet++ is designed as the clutter suppression model Φ_θ in order to make full use of the amplitude and phase information of the SAR image. The input of CV-UNet++ is S − C_1, and the output is the clutter-suppressed slice.
The CV-UNet++ network is shown in FIG. 2. CV-UNet++ extends UNet++ to the complex domain; its convolution layers and activation functions operate in the complex domain.
The clutter suppression network Φ_θ is then trained. Training requires no label that contains only the target without any clutter; the label may contain both clutter and target. During training, Φ_θ is optimised with a mean-square-error (MSE) loss function, as follows:
L(θ) = ‖Φ_θ(S − C_1) − (S − C_2)‖²
where θ denotes the parameters of Φ_θ.
The forward propagation of a CV-UNet++ convolution layer is as follows: the output a^(l-1) of the previous layer is convolved with the weight matrix W^l of the l-th convolution layer and added to the bias b^l to obtain z^l; z^l is then activated by the complex-domain activation function f_C to give the layer output a^l. The computation can be expressed as
z^l = W^l * a^(l-1) + b^l
a^l = f_C(z^l) = σ(Re(z^l)) + jσ(Im(z^l))
Re(z^l) = Re(W^l) * Re(a^(l-1)) − Im(W^l) * Im(a^(l-1)) + Re(b^l)
Im(z^l) = Re(W^l) * Im(a^(l-1)) + Im(W^l) * Re(a^(l-1)) + Im(b^l)
where the complex-domain activation function f_C activates the real and imaginary parts of a complex number separately; σ(·) is the real-domain activation function, for which the invention uses the leaky-ReLU; * denotes the convolution operation; Re(·) and Im(·) denote the real and imaginary parts of a complex quantity; Re(W^l) and Im(W^l) are the real and imaginary parts of the weight matrix W^l of the l-th convolution layer; Re(a^(l-1)) and Im(a^(l-1)) are the real and imaginary parts of the (l-1)-th layer output a^(l-1); and Re(b^l) and Im(b^l) are the real and imaginary parts of the bias b^l of the l-th layer. When l = 1, a^0 = S − C_1, i.e. the input to the first convolution layer is S − C_1; j denotes the imaginary unit.
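The forward pass of one complex-valued convolution layer, expanded into four real convolutions as described above, can be sketched as follows. This is a single-channel, 'valid'-mode illustrative sketch, not the CV-UNet++ implementation, and `complex_conv_forward` is a hypothetical helper name.

```python
import numpy as np
from scipy.signal import convolve2d

def leaky_relu(x, alpha=0.01):
    """Real-domain leaky-ReLU activation."""
    return np.where(x > 0, x, alpha * x)

def complex_conv_forward(a_prev, W, b, alpha=0.01):
    """One complex convolution layer: z = W * a_prev + b, then
    a = sigma(Re z) + j sigma(Im z), with sigma = leaky-ReLU applied
    to the real and imaginary parts separately. The complex product
    is expanded into four real 2-D convolutions."""
    re = (convolve2d(a_prev.real, W.real, mode="valid")
          - convolve2d(a_prev.imag, W.imag, mode="valid") + b.real)
    im = (convolve2d(a_prev.real, W.imag, mode="valid")
          + convolve2d(a_prev.imag, W.real, mode="valid") + b.imag)
    return leaky_relu(re, alpha) + 1j * leaky_relu(im, alpha)
```

Note that `convolve2d` performs true (flipped-kernel) convolution, matching the rot180 term in the back-propagation formulas that follow; deep-learning frameworks usually implement cross-correlation instead.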
The back propagation of a CV-UNet++ convolution layer is as follows: if the error term of the l-th convolution layer is δ^l, the error term of the (l-1)-th layer can be derived as
δ^(l-1) = δ^l * rot180(W^l) ⊙ σ'(z^(l-1))
where σ'(·) is the derivative of the real-domain activation function σ(·); ⊙ denotes the Hadamard product; and rot180(·) rotates a matrix by 180 degrees.
The gradient of the weights of the l-th convolution layer is
∂L/∂W^l = δ^l * a^(l-1)
and the weight update formula of the convolution layer is
W^l ← W^l − η ∂L/∂W^l
where η is the learning rate.
S4, SAR sea clutter suppression in wide-range scene
The wide-scene SAR image is first cropped into slices of size W × H = 128 × 128 to form a test slice set.
It should be noted that the cropping of the wide-scene SAR image does not use a sliding window; there is no overlap between test slices.
Each test slice S is processed: N clutter slices (C_1, C_2, ..., C_N) are randomly selected from the clutter slice set used for CV-UNet++ training, and, following the C2C strategy, the clutter slices are subtracted from the test slice S to obtain S − C_i (i = 1, ..., N).
The slices S − C_i are input into the trained CV-UNet++ to obtain N clutter-suppressed slices (T_1, T_2, ..., T_N), and the average T = (1/N) Σ_{i=1}^{N} T_i is taken as the final clutter suppression result for the test slice S.
The above processing is performed on all slices in the test slice set, and the clutter-suppressed slices are stitched together into the final wide-scene SAR sea clutter suppression output.
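The wide-scene processing of S4 (non-overlapping tiling, N-fold clutter subtraction and averaging, then stitching) can be sketched as follows. Here `model` stands in for the trained CV-UNet++ as any callable mapping a complex slice to a complex slice; the S − C_i input convention is an assumption, and `suppress_wide_scene` is a hypothetical helper name.

```python
import numpy as np

def suppress_wide_scene(scene, model, clutter_set, n_avg=4, tile=128, rng=None):
    """Tile a wide complex SAR scene into non-overlapping tile x tile
    test slices, run each slice through the network for n_avg randomly
    chosen clutter slices, average the n_avg outputs, and stitch the
    results back into a full scene."""
    if rng is None:
        rng = np.random.default_rng()
    h, w = scene.shape
    out = np.zeros_like(scene)
    for r in range(0, h - tile + 1, tile):          # no sliding window:
        for c in range(0, w - tile + 1, tile):      # tiles do not overlap
            s = scene[r:r + tile, c:c + tile]
            idx = rng.choice(len(clutter_set), size=n_avg, replace=False)
            preds = [model(s - clutter_set[i]) for i in idx]
            out[r:r + tile, c:c + tile] = np.mean(preds, axis=0)
    return out
```

Averaging over several independent clutter subtractions reduces the variance of the residual clutter, which is why the text suggests increasing N when the sea clutter reflection is strong.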
When the clutter intensity is weak, N may be 1, and when the sea clutter reflection is strong, N may be increased as appropriate.
The invention has the beneficial effects that:
(a) The invention proposes a SAR sea clutter suppression framework based on the self-supervised training strategy C2C. The framework achieves a good clutter suppression effect without requiring clean clutter-free SAR images or repeated-observation SAR images of the same scene, overcoming the drawback that supervised learning requires clean clutter-free SAR images.
(b) The invention proposes a complex-domain deep learning network, CV-UNet++, and applies it to sea clutter suppression, making full use of the amplitude and phase information of SAR images; compared with the real-domain UNet++ network, CV-UNet++ achieves better clutter suppression performance.
(c) By exploiting deep learning, the invention significantly improves the SAR sea clutter suppression effect and better preserves the complete characteristics of the targets of interest, which benefits subsequent SAR image processing.
The second embodiment is as follows:
This embodiment is a C2C-based self-supervised SAR sea clutter suppression system, which is configured to execute the C2C-based self-supervised SAR sea clutter suppression method.
And a third specific embodiment:
The embodiment is a storage medium, in which at least one instruction is stored, where the at least one instruction is loaded and executed by a processor to implement the C2C-based self-supervised SAR sea clutter suppression method.
The specific embodiment IV is as follows:
The embodiment is an apparatus comprising a processor and a memory, wherein at least one instruction is stored in the memory, and the at least one instruction is loaded and executed by the processor to implement the C2C-based self-supervision SAR sea clutter suppression method.
Examples
The sea clutter suppression is carried out according to the method of the first specific embodiment, and comprises three parts of sample library establishment, C2C strategy and wide-range scene SAR sea clutter suppression.
The sample library establishment part comprises the following specific steps:
(1) Frame the targets of interest in the SAR image, mainly ship targets. The area where the targets of interest are located is the target area, and the area where the background clutter is located is the clutter area;
(2) Randomly select pixel points in the target area as centre points and crop target slices of size 128 × 128 around them to form a target slice set;
(3) Randomly crop clutter slices of the same size in the clutter area to form a clutter slice set; the clutter slices must not overlap the target area.
The specific steps of the SAR wide-range scene sea clutter suppression part in the self-supervision SAR sea clutter suppression method based on C2C are as follows:
1) Using the R0 area shown in FIG. 3 as a training area, and generating a target slice set and a clutter slice set according to the steps of the sample library establishment part;
2) Training a CV-UNet++ network according to a C2C strategy;
3) Taking the R1 and R2 areas shown in FIG. 3 as test areas, and cutting the wide scene SAR according to the size of W×H=128×128 to form a test slice set;
4) Select a test slice S and randomly select N clutter slices (C_1, C_2, ..., C_N) from the clutter slice set; following the C2C strategy, subtract the clutter slices from S to obtain S − C_i (i = 1, ..., N);
5) Input the slices S − C_i into the trained CV-UNet++ to obtain N clutter-suppressed slices (T_1, T_2, ..., T_N), and take the average T = (1/N) Σ_{i=1}^{N} T_i as the final clutter suppression result for the test slice S;
6) If the test slice set has been completely processed, go to step 7); otherwise go to step 4);
7) Stitch all processing results of the test slice set in order into a wide-scene SAR image and output it as the final clutter suppression result; the effect is shown in FIG. 4.
Clutter suppression was performed on the test areas R1 and R2 with UNet++ and CV-UNet++, respectively, as shown in FIG. 4. The signal-to-clutter ratio (SCR) was used to compare the performance of the clutter suppression methods; the post-suppression SCR statistics for each method are given in Table 1. Qualitatively, FIG. 4 shows that both UNet++ and CV-UNet++ suppress the background clutter well, with CV-UNet++ performing better. Notably, CV-UNet++ suppresses the clutter energy while preserving the target energy. Compared with conventional methods, CV-UNet++ trained with the C2C strategy preserves the originality and integrity of the target information to the greatest extent, with almost no loss of the information contained in the target, even that of defocused parts. This fully demonstrates the advantages of the C2C strategy and the deep learning network for SAR sea clutter suppression. Quantitatively, Table 1 shows that the images after clutter suppression by the deep learning methods have higher SCR, with CV-UNet++ achieving the highest. In summary, CV-UNet++ trained with the C2C strategy has the better clutter suppression performance, both qualitatively and quantitatively.
Table 1. SCR statistics (dB) for UNet++ and CV-UNet++
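The signal-to-clutter ratio used for the quantitative comparison can be computed as follows. The exact SCR definition used in the experiments is not given in the text, so this sketch assumes one common convention (mean target power over mean clutter power, expressed in dB); `scr_db` is a hypothetical helper name.

```python
import numpy as np

def scr_db(image, target_mask):
    """Signal-to-clutter ratio in dB of a complex SAR image:
    10 log10 of mean target-pixel power over mean clutter-pixel power.
    target_mask is a boolean map marking target pixels."""
    p_target = np.mean(np.abs(image[target_mask]) ** 2)
    p_clutter = np.mean(np.abs(image[~target_mask]) ** 2)
    return 10.0 * np.log10(p_target / p_clutter)
```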
The computational efficiency of the different methods was measured on a computer configured with an Intel Xeon(R) Silver 4110 CPU and an NVIDIA GeForce RTX graphics card. The processing times of each method on the test areas R1 and R2 were recorded; the results are given in Table 2, the computational-efficiency statistics (s) of UNet++ and CV-UNet++. As Table 2 shows, the processing times of the two deep learning methods, UNet++ and CV-UNet++, are substantially comparable to those of conventional methods. Owing to the complexity of complex-valued operations, CV-UNet++ takes approximately twice as long as UNet++.
Table 2. Computational-efficiency statistics (s) of UNet++ and CV-UNet++
The present invention is capable of other and further embodiments, and its details may be modified and varied, as will be apparent to those skilled in the art, without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A C2C-based self-supervised SAR sea clutter suppression method, characterised by comprising the following steps:
First, cropping the wide-scene SAR image to be suppressed to form a test slice set;
Then processing each test slice S: randomly selecting N clutter slices (C_1, C_2, ..., C_N) from the clutter slice set used for CV-UNet++ training, and, following the C2C strategy, subtracting the clutter slices from the test slice S to obtain S − C_i (i = 1, ..., N);
Inputting the slices S − C_i into the trained CV-UNet++ to obtain N clutter-suppressed slices (T_1, T_2, ..., T_N), and taking the average T = (1/N) Σ_{i=1}^{N} T_i as the final clutter suppression result for the test slice S;
The training process of CV-UNet++ comprises the following steps:
S1, establishing a sample library:
Dividing the SAR image used for training into two areas: the area where the targets of interest are located is the target area, and the area where the background clutter is located is the clutter area;
Randomly selecting pixel points in the target area as centre points and cropping W × H target slices around them to form a target slice set;
Randomly cropping clutter slices of the same size in the clutter area to form a clutter slice set, where the clutter slices must not overlap the target area;
S2, the C2C strategy:
Randomly selecting a target slice S from the target slice set and two clutter slices C_1 and C_2 from the clutter slice set, and constructing the complex-domain SAR image pair (S − C_1, S − C_2);
S3, taking CV-UNet++ as the clutter suppression model Φ_θ, and training Φ_θ with the complex-domain SAR image pairs to obtain the trained CV-UNet++ network.
2. The C2C-based self-supervised SAR sea clutter suppression method according to claim 1, characterised in that, in cropping the wide-scene SAR image to be suppressed, the image is cropped into slices of size W × H = 128 × 128.
3. The C2C-based self-supervised SAR sea clutter suppression method according to claim 2, characterised in that there is no overlap between the test slices when cropping the wide-scene SAR image to be suppressed.
4. The C2C-based self-supervised SAR sea clutter suppression method according to claim 1, 2 or 3, characterised in that, in training the clutter suppression model Φ_θ, a mean-square-error loss function is used to optimise Φ_θ, as follows:
L(θ) = ‖Φ_θ(S − C_1) − (S − C_2)‖²
where θ denotes the parameters of Φ_θ.
5. The C2C-based self-supervised SAR sea clutter suppression method according to claim 4, characterised in that the forward propagation of a CV-UNet++ convolution layer is: the output a^(l-1) of the previous layer is convolved with the weight matrix W^l of the l-th convolution layer and added to the bias b^l to obtain z^l; z^l is then activated by the complex-domain activation function f_C to give the layer output a^l; the computation is expressed as
z^l = W^l * a^(l-1) + b^l
a^l = f_C(z^l) = σ(Re(z^l)) + jσ(Im(z^l))
Re(z^l) = Re(W^l) * Re(a^(l-1)) − Im(W^l) * Im(a^(l-1)) + Re(b^l)
Im(z^l) = Re(W^l) * Im(a^(l-1)) + Im(W^l) * Re(a^(l-1)) + Im(b^l)
where the complex-domain activation function f_C activates the real and imaginary parts of a complex number separately; σ(·) is the real-domain activation function; * denotes the convolution operation; Re(·) and Im(·) denote the real and imaginary parts of a complex quantity; when l = 1, a^0 = S − C_1, i.e. the input to the first convolution layer is S − C_1; and j denotes the imaginary unit.
6. The C2C-based self-supervised SAR sea clutter suppression method according to claim 5, characterised in that the real-domain activation function uses the leaky-ReLU.
7. The C2C-based self-supervised SAR sea clutter suppression method according to claim 6, characterised in that, in training CV-UNet++, the back propagation of a convolution layer is: if the error term of the l-th convolution layer is δ^l, the error term of the (l-1)-th layer is
δ^(l-1) = δ^l * rot180(W^l) ⊙ σ'(z^(l-1))
where σ'(·) is the derivative of the real-domain activation function σ(·); ⊙ denotes the Hadamard product; and rot180(·) rotates a matrix by 180 degrees;
the gradient of the weights of the l-th convolution layer is
∂L/∂W^l = δ^l * a^(l-1)
and the weight update formula of the convolution layer is
W^l ← W^l − η ∂L/∂W^l, where η is the learning rate.
8. A C2C-based self-supervised SAR sea clutter suppression system, characterised in that the system is configured to perform the C2C-based self-supervised SAR sea clutter suppression method of any one of claims 1 to 7.
9. A storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the C2C-based self-supervised SAR sea clutter suppression method of any of claims 1 to 7.
10. An apparatus comprising a processor and a memory having stored therein at least one instruction that is loaded and executed by the processor to implement the C2C-based self-supervised SAR sea clutter suppression method of any of claims 1 to 7.
CN202111417460.0A 2021-11-25 2021-11-25 C2C-based self-supervision SAR sea clutter suppression method, system, storage medium and device Active CN114114261B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111417460.0A CN114114261B (en) 2021-11-25 2021-11-25 C2C-based self-supervision SAR sea clutter suppression method, system, storage medium and device


Publications (2)

Publication Number Publication Date
CN114114261A CN114114261A (en) 2022-03-01
CN114114261B true CN114114261B (en) 2024-07-02

Family

ID=80373746


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116228609B (en) * 2023-05-10 2023-07-21 中国人民解放军国防科技大学 Radar image speckle filtering method and device based on zero sample learning

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109740549A (en) * 2019-01-08 2019-05-10 西安电子科技大学 SAR image object detection system and method based on semi-supervised CNN
CN110824473A (en) * 2019-10-21 2020-02-21 西北工业大学 Subspace-based high-resolution wide swath SAR-GMTI clutter suppression method



Similar Documents

Publication Publication Date Title
Xu et al. A systematic review and analysis of deep learning-based underwater object detection
US20240161479A1 (en) Polarized Image Enhancement using Deep Neural Networks
CN114114261B (en) C2C-based self-supervision SAR sea clutter suppression method, system, storage medium and device
CN110084181B (en) Remote sensing image ship target detection method based on sparse MobileNet V2 network
Singh et al. A comparative analysis of illumination estimation based Image Enhancement techniques
CN113052006A (en) Image target detection method and system based on convolutional neural network and readable storage medium
Wang et al. Proposal-Copula-Based Fusion of Spaceborne and Airborne SAR Images for Ship Target Detection⁎⁎
CN115546622A (en) Fish shoal detection method and system, electronic device and storage medium
Liu et al. SAR image specle reduction based on a generative adversarial network
Singh et al. Review on nontraditional perspectives of synthetic aperture radar image despeckling
CN112731410B (en) Underwater target sonar detection method based on CNN
Xiao et al. Underwater image classification based on image enhancement and information quality evaluation
Mohana Dhas et al. Blood cell image denoising based on tunicate rat swarm optimization with median filter
Wei et al. A lightweight underwater target detection network for seafood
CN109285148B (en) Infrared weak and small target detection method based on heavily weighted low rank and enhanced sparsity
CN116958792A (en) False alarm removing method for assisting SAR vehicle target detection
Gur et al. Autocorrelation based denoising of manatee vocalizations using the undecimated discrete wavelet transform
Zhang et al. GGADN: Guided generative adversarial dehazing network
Chen et al. Ultrasound image denoising with multi-shape patches aggregation based non-local means
Orescanin et al. A Study on the Effect of Commonly Used Data Augmentation Techniques on Sonar Image Artifact Detection Using Deep Neural Networks
Zhou et al. A lightweight object detection framework for underwater imagery with joint image restoration and color transformation
Mir A SURVEY OF MONTE CARLO DENOISING: CHALLENGES AND POSSIBLE SOLUTIONS.
Kaur et al. Deep learning with invariant feature based species classification in underwater environments
Lei et al. Multi-level residual attention network for speckle suppression
Busson et al. Seismic shot gather noise localization using a multi-scale feature-fusion-based neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant