CN110084210B - SAR image multi-scale ship detection method based on attention pyramid network - Google Patents
- Publication number
- CN110084210B (application CN201910362037.1A)
- Authority
- CN
- China
- Prior art keywords
- attention
- channels
- layer
- convolution
- cbam
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Biophysics (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Astronomy & Astrophysics (AREA)
- Remote Sensing (AREA)
- Image Analysis (AREA)
Abstract
The invention belongs to the technical field of radar remote sensing and relates to a multi-scale ship detection method for SAR images based on an attention pyramid network. Building on the existing feature pyramid network, the invention provides a multi-scale feature extraction method that adaptively selects salient features: a feature pyramid network based on a dense attention mechanism, applied to multi-scale ship target detection in SAR images. Salient features are highlighted over the global and local ranges by a channel attention model and a spatial attention model respectively, yielding better detection performance; at the same time, applying the attention mechanism in the multi-scale fusion process of each layer enhances the features layer by layer, effectively eliminates false-alarm targets, and improves detection accuracy.
Description
Technical Field
The invention belongs to the technical field of radar remote sensing, and relates to an SAR image multi-scale ship detection method based on an attention pyramid network.
Background
In recent years, detecting sea-surface ship targets with Synthetic Aperture Radar (SAR) images has become a worldwide research hotspot. As a major maritime country, China has a long coastline and vast sea areas; using SAR for real-time ocean monitoring and developing ship target detection research based on SAR images is of great significance for safeguarding national security and protecting China's maritime rights and interests.
At present, ship targets are of many types and widely varying sizes. Ships of different scales differ considerably: a large-scale ship occupies many pixels in a SAR image, whereas a small-scale ship target occupies few pixels even in a high-resolution SAR image and has low contrast, which increases detection difficulty. Traditional ship detection algorithms are insensitive to small-scale ship targets, so their multi-scale ship detection performance on SAR images is poor, and they cannot detect ships of different scales in a SAR image simultaneously.
In recent years, Convolutional Neural Networks (CNNs) have achieved great success in computer vision owing to their powerful representation capability and automatic feature extraction. Many researchers have extended CNNs to ship detection in SAR images, demonstrating the excellent performance of CNN-based detectors in complex sea-surface environments, but few CNN detectors are currently suited to multi-scale ship detection in SAR images. To address the problem that most CNN-based detectors use only the last-layer feature map for detection and thus lose spatial resolution information, the Feature Pyramid Network (FPN) fuses spatial resolution information and semantic information through "top-down" and "lateral connection" operations and detects on the resulting fused feature maps, giving a first demonstration of CNN detection performance on multi-scale ships in SAR images. Although FPN extracts features that fuse spatial resolution and semantic information, its detection accuracy on multi-scale ships is poor and it is prone to missed detections, because the extracted features are not rich enough and salient features are not emphasized. Later, some researchers proposed densely connected FPNs, which extract rich target features through dense connections, but faced with such a huge volume of features, detection slows down and the false-alarm rate rises. How to adaptively select salient features from the rich extracted target features, highlight the characteristics of ship targets at different scales, and improve the accuracy of multi-scale ship detection in SAR images is the open problem.
Disclosure of Invention
In view of the above problems and defects, and to overcome the inability of existing SAR image ship detection methods to adaptively select salient target features from the massive extracted global features, which degrades detection accuracy, the invention gives the feature extraction stage of SAR image multi-scale ship detection the ability to adaptively select salient features and to highlight the characteristics of ships at different scales. The invention provides a SAR image multi-scale ship detection method based on a Dense Attention Pyramid Network (DAPN).
The invention is realized through the following steps; the overall flow of the ship detection algorithm is shown in FIG. 1.
Step 1. Feed the SAR image to be detected into the detection network and, through shared convolution layers, obtain the feature maps {C2, C3, C4, C5} of each layer of the bottom-up forward network branch of the pyramid network, extracting unblurred global features at different scales.
This process extracts unblurred global features at different scales from the original SAR image through the bottom-up forward network.
The base network of the invention is ResNet101. Layers whose feature maps share the same size are called a stage, and the outputs {C2, C3, C4, C5} of the last residual block in each stage of ResNet101 form the bottom-up forward network.
In the bottom-up forward network, high-level feature maps have richer semantic information but lack spatial resolution information, and are suited to detecting large-scale ships. Low-level feature maps are the opposite: they have higher resolution but contain only shallow features, and are suited to detecting small-scale ships. Effectively fusing spatial resolution information and semantic information is therefore crucial for multi-scale ship detection. The feature maps {C2, C3, C4, C5} of the bottom-up forward network in DAPN mainly provide global features not yet blurred by the subsequent upsampling and Convolutional Block Attention Module (CBAM) operations; these unblurred global features can be used to detect the rough positions of all suspected ship targets and reduce missed detections.
Step 2. Reduce the channel dimensionality of {C2, C3, C4, C5} with 1×1 convolutions, and laterally connect the results with the same-size fused feature maps of the densely connected CBAM to obtain the top-down network branch {P2, P3, P4, P5}.
(1) Reducing the number of channels of {C2, C3, C4, C5}

Using 256 1×1 convolution kernels, the feature maps {C2, C3, C4, C5} of the bottom-up forward network, with 256, 512, 1024, and 2048 channels respectively, are each reduced to 256 channels, yielding the channel-reduced feature maps {CR2, CR3, CR4, CR5}. The computation is:

CRi = Conv1×1(Ci), i = 2, 3, 4, 5
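The channel reduction can be sketched in NumPy as below; a 1×1 convolution is just a per-pixel linear map across channels. The map sizes and random weights are illustrative placeholders, not the trained parameters:

```python
import numpy as np

def conv1x1(feature_map, weights):
    # 1x1 convolution = the same linear map over channels applied at every pixel.
    # feature_map: (C_in, H, W); weights: (C_out, C_in).
    c_in, h, w = feature_map.shape
    flat = feature_map.reshape(c_in, -1)          # (C_in, H*W)
    out = weights @ flat                          # (C_out, H*W)
    return out.reshape(weights.shape[0], h, w)

rng = np.random.default_rng(0)
c5 = rng.standard_normal((2048, 8, 8))            # hypothetical C5 map
w5 = rng.standard_normal((256, 2048)) * 0.01      # 256 1x1 kernels
cr5 = conv1x1(c5, w5)
print(cr5.shape)  # (256, 8, 8)
```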
(2) Laterally connect {CR2, CR3, CR4, CR5} with the fused feature maps of the densely connected CBAM to obtain {P2, P3, P4, P5}.
In this process, CBAM extracts salient local features from the rich features, enhancing the feature maps layer by layer and effectively eliminating false-alarm targets. CBAM, proposed in 2018, consists mainly of a channel attention mechanism and a spatial attention mechanism. The input feature map F has C channels, each of width W and height H. CBAM first passes F through the channel attention mechanism to obtain a channel attention map AC; F and AC are multiplied pixel-wise to give the channel-attended feature map FC; FC then passes through the spatial attention mechanism to obtain a spatial attention map AS; finally, FC and AS are multiplied pixel-wise to give the salient feature map F′. The CBAM computation is summarized as:

FC = AC(F) ⊗ F, F′ = AS(FC) ⊗ FC

where ⊗ denotes pixel-wise multiplication.
The channel attention mechanism aggregates spatial information by using maximum pooling and average pooling operations simultaneously, then reduces parameters through a multilayer perceptron (MLP), and finally obtains a channel attention map AC, and the calculation process is shown as the following formula:
AC(F)=σ(MLP(AvgPool(F))+MLP(MaxPool(F))),
where σ represents the activation function and AvgPool and MaxPool represent the average pooling and maximum pooling operations, respectively.
The spatial attention mechanism first applies maximum pooling and average pooling along the channel direction, concatenates the results, and finally applies a 7×7 convolution to the concatenated feature map to obtain the spatial attention map AS. The computation can be summarized as:
AS(FC)=σ(Conv7×7([AvgPool(FC);MaxPool(FC)]))
where Conv7×7 denotes a 7×7 convolution operation.
The overall structure of the CBAM is shown in FIG. 2.
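A minimal NumPy sketch of CBAM following the two formulas above; the MLP weights, reduction ratio, and 7×7 kernel are random placeholders for illustration, not trained values:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(f, w1, w2):
    # AC(F) = sigmoid(MLP(AvgPool(F)) + MLP(MaxPool(F))), pooling over H and W.
    avg = f.mean(axis=(1, 2))                      # (C,)
    mx = f.max(axis=(1, 2))                        # (C,)
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)   # shared 2-layer MLP, ReLU hidden
    return sigmoid(mlp(avg) + mlp(mx))             # (C,)

def spatial_attention(fc, k7):
    # AS(FC) = sigmoid(Conv7x7([AvgPool; MaxPool])), pooling along channels.
    stacked = np.stack([fc.mean(axis=0), fc.max(axis=0)])   # (2, H, W)
    pad = np.pad(stacked, ((0, 0), (3, 3), (3, 3)))         # zero padding
    h, w = fc.shape[1:]
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(pad[:, i:i + 7, j:j + 7] * k7)
    return sigmoid(out)                            # (H, W)

def cbam(f, w1, w2, k7):
    ac = channel_attention(f, w1, w2)
    fc = f * ac[:, None, None]                     # FC = AC(F) (x) F
    return fc * spatial_attention(fc, k7)[None]    # F' = AS(FC) (x) FC

rng = np.random.default_rng(1)
f = rng.standard_normal((32, 16, 16))
w1 = rng.standard_normal((4, 32)) * 0.1            # reduction ratio 8 (assumed)
w2 = rng.standard_normal((32, 4)) * 0.1
k7 = rng.standard_normal((2, 7, 7)) * 0.1          # 7x7 kernel, 2 pooled channels
out = cbam(f, w1, w2, k7)
print(out.shape)  # (32, 16, 16)
```

Note how the channel map AC is one weight per channel and the spatial map AS one weight per pixel, so the two multiplications re-weight the feature map globally and locally respectively.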
The top-down network branch {P2, P3, P4, P5} is computed as follows:
1) The top-layer feature map P5 is obtained directly from the channel-reduced CR5:
P5 = CR5 = Conv1×1(C5)
where Conv1×1(·) denotes a 1×1 convolution;
2) Dense connection: the feature maps {Pi+1, …, P5} of all layers above the current layer Pi are upsampled to the size of the current layer's feature map and added pixel-wise to give the densely connected feature map FDi:
FDi = Upsample(Pi+1) + … + Upsample(P5)
where Upsample(·) denotes the upsampling operation;
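The dense connection step can be sketched as below, using nearest-neighbour upsampling for simplicity (the patent does not specify the interpolation method, so that choice is an assumption; the pyramid sizes are illustrative):

```python
import numpy as np

def upsample(p, factor):
    # Nearest-neighbour upsampling of a (C, H, W) map by an integer factor.
    return p.repeat(factor, axis=1).repeat(factor, axis=2)

def dense_connect(higher_maps, target_hw):
    # FDi = sum over all higher layers of Upsample(Pj),
    # each brought to the current layer's spatial size, added pixel-wise.
    th, tw = target_hw
    fd = np.zeros((higher_maps[0].shape[0], th, tw))
    for p in higher_maps:
        fd += upsample(p, th // p.shape[1])
    return fd

# Hypothetical pyramid levels P3..P5 (strides 8, 16, 32 of a 256x256 input).
p3 = np.full((256, 32, 32), 1.0)
p4 = np.full((256, 16, 16), 2.0)
p5 = np.full((256, 8, 8), 3.0)
fd2 = dense_connect([p3, p4, p5], (64, 64))    # dense map for the P2 level
print(fd2.shape)  # (256, 64, 64)
```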
3) CBAM: the densely connected feature map FDi is used as the input feature map for the CBAM operation, giving the salient feature map Ai of each layer:
Ai = A(FDi)
where A(·) denotes the CBAM operation;
the CBAM is composed of a channel attention mechanism and a space attention mechanism, and the characteristic diagram after dense connection is inputThe number of channels is C, the width and the height of each channel image are W and H respectively, and the CBAM firstly inputs a feature map FDiChannel attention map through channel attention mechanismFDiAnd ACObtaining a characteristic diagram of a channel attention mechanism after pixel-level multiplicationThen FCObtaining a spatial attention map by a spatial attention mechanismFinally FCAnd ASObtaining a salient feature map of each layer after pixel-level multiplicationThe CBAM is calculated by the following steps:
the channel attention mechanism aggregates spatial information by simultaneously utilizing maximum pooling and average pooling operations, and then reduces parameters through a multilayer perceptron to obtain a channel attention diagram ACThe calculation process is shown as the following formula:
AC(FDi)=σ(MLP(AvgPool(FDi))+MLP(MaxPool(FDi))),
where σ represents the activation function, AvgPool and MaxPool represent the average pooling and maximum pooling operations, respectively, and MLP represents the multilayer perceptron;
the spatial attention mechanism firstly uses maximum pooling and average pooling in the channel direction, splices the results, and finally uses a 7 multiplied by 7 convolution on the spliced characteristic diagram to obtain a spatial attention diagram ASThe calculation process is as follows:
AS(FC)=σ(Conv7×7([AvgPool(FC);MaxPool(FC)]))
where Conv7×7 denotes a 7×7 convolution operation;
4) Lateral connection: {CR2, CR3, CR4} are added pixel-wise to the corresponding {A2, A3, A4}, and a 3×3 convolution is applied to reduce the aliasing effect introduced by upsampling, giving {P2, P3, P4}:
Pi = Conv3×3(CRi + Ai), i = 2, 3, 4
where Conv3×3(·) denotes a 3×3 convolution. Through the above operations, the top-down network branch {P2, P3, P4, P5} of DAPN is obtained.
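The lateral connection can be sketched as follows. A per-channel (depthwise) 3×3 convolution with an averaging kernel stands in for the learned anti-aliasing convolution; both the kernel and the all-ones inputs are illustrative assumptions:

```python
import numpy as np

def conv3x3_depthwise(f, k3):
    # Per-channel 3x3 convolution with zero padding; an illustrative stand-in
    # for the learned 3x3 conv that suppresses upsampling aliasing.
    c, h, w = f.shape
    pad = np.pad(f, ((0, 0), (1, 1), (1, 1)))
    out = np.empty_like(f)
    for i in range(h):
        for j in range(w):
            out[:, i, j] = np.sum(pad[:, i:i + 3, j:j + 3] * k3, axis=(1, 2))
    return out

def lateral_connect(cr_i, a_i, k3):
    # P_i = Conv3x3(CR_i + A_i): fuse the channel-reduced global map with
    # the CBAM salient map by pixel-level addition, then smooth.
    return conv3x3_depthwise(cr_i + a_i, k3)

cr2 = np.ones((256, 64, 64))          # channel-reduced global map CR2
a2 = np.ones((256, 64, 64))           # CBAM salient map A2
k3 = np.full((3, 3), 1.0 / 9.0)       # averaging kernel as placeholder weights
p2 = lateral_connect(cr2, a2, k3)
print(p2.shape)  # (256, 64, 64)
```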
In the top-down network, the high-level feature maps are upsampled into high-resolution feature maps containing rich semantic information; dense pixel-wise addition of the fused feature maps of different layers then yields sufficiently rich multi-scale feature maps, from which CBAM extracts salient local features that can be used for fine recognition of multi-scale ship targets in SAR images. The lateral connections fuse the unblurred global features with the extracted salient local features, so that multi-scale ship detection can recognize targets accurately with few missed detections, effectively reduce false-alarm targets, and improve the accuracy of SAR image multi-scale ship detection.
The Dense Attention Pyramid Network (DAPN) consists mainly of three parts: a bottom-up forward network, lateral connections, and a top-down CBAM densely connected network. The structure of DAPN is shown in FIG. 3.
Step 3. Feed the final fused feature maps of different scales of each layer into the Region Proposal Network (RPN) and detect multi-scale ship targets in the SAR image with Faster R-CNN.
Building on the existing feature pyramid network, the invention provides a multi-scale feature extraction method that adaptively selects salient features: a feature pyramid network based on a dense attention mechanism, applied to multi-scale ship target detection in SAR images. Salient features are highlighted over the global and local ranges by a channel attention model and a spatial attention model respectively, yielding better detection performance; at the same time, applying the attention mechanism in the multi-scale fusion process of each layer enhances the features layer by layer, effectively eliminates false-alarm targets, and improves detection accuracy.
In conclusion, compared with existing SAR image multi-scale ship target detection methods, the method can extract rich features of ship targets at different scales, improving the detection of small-scale ships; through the dense attention pyramid network it can adaptively select salient features at different scales, effectively removing false-alarm targets and greatly improving detection precision; and on the basis of accurately detecting small-scale ship targets, its detection accuracy exceeds that of traditional methods.
Drawings
FIG. 1 is a flow chart of a multi-scale ship target detection method of the present invention;
FIG. 2 is a flow chart of the Convolutional Block Attention Module (CBAM);
FIG. 3 is a structure of a Dense Attention Pyramid Network (DAPN);
fig. 4 is a schematic diagram of a detection result of the SAR image multi-scale ship of the present invention.
Detailed Description
The invention is further explained below by performing multi-scale ship detection with the SSDD ship dataset provided by the Naval Aviation University of the Chinese People's Liberation Army.
The dataset used in the experiments is the SAR Ship Detection Dataset (SSDD), which contains SAR ship images of different types. Table 1 lists the SAR image categories in SSDD.
TABLE 1 SAR image categories in SSDD
In the experiments, the training, validation, and test sets were constructed at a ratio of 7:2:1. The learning rate was initialized to 0.001 and decayed by a factor of 0.1 every 2000 steps; the weight decay was 0.0001; the momentum was set to 0.9; the anchor scales were set to {16², 24², 40², 60², 80²}, with seven aspect ratios {1:1, 1:2, 1:3, 2:1, 2:3, 3:1, 3:2} at each scale, to satisfy ship detection requirements at different scales. The multi-scale ship detection model that adaptively selects salient features at different scales was obtained through the following training process: the original pictures are input and preprocessed (cropping, data augmentation, etc.); the preprocessed pictures and the ground-truth label files are fed into the ResNet101 base network to obtain the feature maps {C2, C3, C4, C5} of the bottom-up forward network of DAPN; through 1×1 convolution, upsampling, dense addition, and CBAM operations, the salient feature maps are obtained and laterally connected with {C2, C3, C4, C5} by pixel-wise addition to give the final fused feature maps {P2, P3, P4, P5}; finally, the fused feature maps {P2, P3, P4, P5} are sent to Faster R-CNN to obtain the final ship detection results.
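The 5 scales × 7 aspect ratios give 35 anchor shapes per location. A small sketch of how such a grid can be enumerated (interpreting each ratio as height:width while preserving the anchor area is an assumption; the patent does not state the exact parameterization):

```python
# Generate the 5 scales x 7 aspect ratios = 35 anchor shapes described above.
scales = [16, 24, 40, 60, 80]                 # anchor areas are scale^2 pixels
ratios = [(1, 1), (1, 2), (1, 3), (2, 1), (2, 3), (3, 1), (3, 2)]

anchors = []
for s in scales:
    area = float(s * s)
    for rh, rw in ratios:                     # height : width = rh : rw (assumed)
        h = (area * rh / rw) ** 0.5           # preserve area while setting the ratio
        w = area / h
        anchors.append((h, w))
print(len(anchors))  # 35
```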
TABLE 2 naval vessel survey accuracy comparison
Compared with existing ship detection methods, the ship detection precision of the invention improves by more than 10%. Table 2 compares the ship detection precision of the invention with other existing SAR image ship detection methods, and FIG. 4 compares the multi-scale ship detection results. As can be seen from the figure, the detection precision of the DAPN-based multi-scale ship detection method is superior to the improved Faster R-CNN and SSD detection methods; the method can not only detect ships of different scales in one SAR image but also effectively remove false alarms and improve detection precision.
Claims (1)
1. A SAR image multi-scale ship detection method based on an attention pyramid network, characterized in that a dense attention pyramid network (DAPN) is adopted to extract features of a synthetic aperture radar (SAR) image and realize multi-scale ship detection, wherein the DAPN comprises a bottom-up forward network, lateral connections, and a top-down Convolutional Block Attention Module (CBAM) densely connected network; the specific method comprises the following steps:
S1. Through shared convolution layers, obtain the feature maps {C2, C3, C4, C5} of each layer of the bottom-up forward network branch in DAPN, extracting unblurred global features at different scales;
S2. Reduce the channel dimensionality of {C2, C3, C4, C5} with 1×1 convolutions and laterally connect the results with the same-size fused feature maps of the densely connected CBAM to obtain the top-down network branch {P2, P3, P4, P5}, specifically comprising:
S21. Reducing the number of channels of {C2, C3, C4, C5}:
Using 256 1×1 convolution kernels, the feature maps {C2, C3, C4, C5} of the bottom-up forward network, with 256, 512, 1024, and 2048 channels respectively, are each reduced to 256 channels, yielding the channel-reduced feature maps {CR2, CR3, CR4, CR5}:
CRi = Conv1×1(Ci), i = 2, 3, 4, 5
S22. The top-layer feature map P5 is obtained directly from the channel-reduced CR5:
P5 = CR5 = Conv1×1(C5)
where Conv1×1(·) denotes a 1×1 convolution;
S23. Dense connection: the feature maps {Pi+1, …, P5} of all layers above the current layer Pi are upsampled to the size of the current layer's feature map and added pixel-wise to give the densely connected feature map FDi:
FDi = Upsample(Pi+1) + … + Upsample(P5)
where Upsample(·) denotes the upsampling operation;
S24. CBAM: perform the CBAM operation on the densely connected feature map FDi of each layer to obtain the salient feature map Ai of each layer, as follows:
the CBAM is composed of a channel attention mechanism and a space attention mechanism, and the characteristic diagram after dense connection is inputThe number of channels is C, the width and the height of each channel image are W and H respectively, and the CBAM firstly inputs a feature map FDiChannel attention map through channel attention mechanismFDiAnd ACObtaining a characteristic diagram of a channel attention mechanism after pixel-level multiplicationThen FCObtaining a spatial attention map by a spatial attention mechanismFinally FCAnd ASObtaining a salient feature map of each layer after pixel-level multiplicationThe CBAM is calculated by the following steps:
wherein the content of the first and second substances,representing pixel-level multiplication, ACFor channel attention map, ASA spatial attention map is obtained;
the channel attention mechanism aggregates spatial information by simultaneously utilizing maximum pooling and average pooling operations, and then reduces parameters through a multilayer perceptron to obtain a channel attention diagram ACThe calculation process is shown as the following formula:
AC(FDi)=σ(MLP(AvgPool(FDi))+MLP(MaxPool(FDi))),
where σ represents the activation function, AvgPool and MaxPool represent the average pooling and maximum pooling operations, respectively, and MLP represents the multilayer perceptron;
space attention machineFirstly, maximum pooling and average pooling are used in channel direction, the results are spliced, and finally, a 7 multiplied by 7 convolution is used on a spliced feature map to obtain a space attention map ASThe calculation process is as follows:
AS(FC)=σ(Conv7×7([AvgPool(FC);MaxPool(FC)]))
where Conv7×7 denotes a 7×7 convolution operation;
S25. Lateral connection: {CR2, CR3, CR4} are added pixel-wise to the corresponding {A2, A3, A4}, and a 3×3 convolution is applied to reduce the aliasing effect introduced by upsampling, giving {P2, P3, P4}:
Pi = Conv3×3(CRi + Ai), i = 2, 3, 4
where Conv3×3(·) denotes a 3×3 convolution; through the above operations the top-down network branch {P2, P3, P4, P5} of DAPN is obtained;
S3. Send the final fused feature maps {P2, P3, P4, P5} of different scales of each layer to the region proposal network to generate proposal regions, and finally detect multi-scale ship targets in the SAR image with Faster R-CNN.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910362037.1A CN110084210B (en) | 2019-04-30 | 2019-04-30 | SAR image multi-scale ship detection method based on attention pyramid network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110084210A CN110084210A (en) | 2019-08-02 |
CN110084210B true CN110084210B (en) | 2022-03-29 |
Family
ID=67418161
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910362037.1A Active CN110084210B (en) | 2019-04-30 | 2019-04-30 | SAR image multi-scale ship detection method based on attention pyramid network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110084210B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2545661A (en) * | 2015-12-21 | 2017-06-28 | Nokia Technologies Oy | A method for analysing media content |
CN107274401A (en) * | 2017-06-22 | 2017-10-20 | PLA Naval Aeronautical Engineering Institute | High-resolution SAR image ship detection method based on a visual attention mechanism |
CN107358258A (en) * | 2017-07-07 | 2017-11-17 | Xidian University | SAR image target classification based on NSCT dual-channel CNN and a selective attention mechanism |
CN108038519A (en) * | 2018-01-30 | 2018-05-15 | Zhejiang University | Cervical image processing method and device based on a dense feature pyramid network |
CN108710830A (en) * | 2018-04-20 | 2018-10-26 | Zhejiang Gongshang University | Dense human-body 3D pose estimation method combining a connected attention pyramid residual network with equidistant constraints |
CN109614985A (en) * | 2018-11-06 | 2019-04-12 | South China University of Technology | Object detection method based on a densely connected feature pyramid network |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020154833A1 (en) * | 2001-03-08 | 2002-10-24 | Christof Koch | Computation of intrinsic perceptual saliency in visual environments, and applications |
- 2019-04-30: Application CN201910362037.1A filed in China (CN); granted as CN110084210B (status: Active)
Non-Patent Citations (6)
Title |
---|
A novel visual attention method for target detection from SAR images;Fei GAO等;《Chinese Journal of Aeronautics》;20190425;第32卷(第8期);1946-1958 * |
Hebbian-based neural networks for bottom-up visual attention and its applications to ship detection in SAR images;Ying Yu等;《Neurocomputing》;20110219;第74卷(第11期);2008-2017 * |
SAR image target detection in complex environments based on improved visual attention algorithm;Shuo Liu等;《EURASIP Journal on Wireless Communications and Networking》;20141231(第(2014)1期);1-8 * |
Simultaneous Ship Detection and Orientation Estimation in SAR Images Based on Attention Module and Angle Regression;Wang, Jizhou等;《SENSORS》;20180829;第18卷(第9期);1-17 * |
A survey of ship target detection and recognition in optical remote sensing images;Wang Yanqing, Ma Lei, Tian Yuan;《Acta Automatica Sinica》;20110930(No. 9, 2011);1029-1039 *
Target detection and classification in high-resolution SAR images based on multi-scale deep networks and a visual attention mechanism;Hou Yaoqi;《China Master's Theses Full-text Database, Information Science and Technology》;20190215(No. 2, 2019);I136-I237 *
Also Published As
Publication number | Publication date |
---|---|
CN110084210A (en) | 2019-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110084210B (en) | SAR image multi-scale ship detection method based on attention pyramid network | |
CN111753677B (en) | Multi-angle remote sensing ship image target detection method based on characteristic pyramid structure | |
CN112149591B (en) | SSD-AEFF automatic bridge detection method and system for SAR image | |
CN114612769B (en) | Integrated sensing infrared imaging ship detection method integrated with local structure information | |
CN112487912B (en) | Arbitrary direction ship detection method based on improved YOLOv3 | |
CN116343045B (en) | Lightweight SAR image ship target detection method based on YOLO v5 | |
Zhang et al. | Nearshore vessel detection based on Scene-mask R-CNN in remote sensing image | |
Xu et al. | On-board ship detection in SAR images based on L-YOLO | |
CN116168240A (en) | Arbitrary-direction dense ship target detection method based on attention enhancement | |
Kong et al. | Lightweight algorithm for multi-scale ship detection based on high-resolution SAR images | |
Liu et al. | Target detection and tracking algorithm based on improved Mask RCNN and LMB | |
Zhou et al. | Automatic ship detection in SAR Image based on Multi-scale Faster R-CNN | |
Zhang et al. | Surface defect detection of wind turbine based on lightweight YOLOv5s model | |
CN116188944A (en) | Infrared dim target detection method based on Swin-Transformer and multi-scale feature fusion | |
Zhang et al. | Swin-PAFF: A SAR Ship Detection Network with Contextual Cross-Information Fusion. | |
Chen et al. | SAR ship detection under complex background based on attention mechanism | |
CN112800932B (en) | Method for detecting remarkable ship target in offshore background and electronic equipment | |
Zhang et al. | Feature enhanced centernet for object detection in remote sensing images | |
Yang et al. | Double feature pyramid networks for classification and localization on object detection | |
Peng | Computer Information Technology and Network Security Analysis of Intelligent Image Recognition | |
CN110222632A (en) | Water-surface target detection method with grey-prediction-assisted region proposals | |
CN110135239A (en) | Method for recognizing harbour ship targets in optical remote sensing images | |
Zhao et al. | Optical and SAR remote sensing image ship detection based on attention mechanism | |
CN113837080B (en) | Small target detection method based on information enhancement and receptive field enhancement | |
CN115471729B (en) | Ship target identification method and system based on improved YOLOv5 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||