CN112163504A - Remote sensing image small sample ship target identification method based on structure chart convolutional network - Google Patents

Remote sensing image small sample ship target identification method based on structure chart convolutional network

Info

Publication number
CN112163504A
Authority
CN
China
Prior art keywords
layer
graph
ship
node
nodes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011017557.8A
Other languages
Chinese (zh)
Other versions
CN112163504B (en
Inventor
陈华杰
吕丹妮
韦玉潭
吴栋
白浩然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202011017557.8A priority Critical patent/CN112163504B/en
Publication of CN112163504A publication Critical patent/CN112163504A/en
Application granted granted Critical
Publication of CN112163504B publication Critical patent/CN112163504B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/047Probabilistic or stochastic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Multimedia (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a remote sensing image small-sample ship target identification method based on a structure graph convolutional network. By raising traditional target identification from the pixel level to the component level, the method breaks through the limitations of existing small-sample target identification frameworks and improves both the accuracy and the speed of small-sample target identification.

Description

Remote sensing image small sample ship target identification method based on structure chart convolutional network
Technical Field
The invention relates to the field of target identification, and in particular to a remote sensing image small-sample ship target identification method based on a structure graph convolutional network.
Background
Small-sample learning is a hot topic in current machine learning research; its core idea is to extract additional feature information from related data or tasks in order to improve generalization. For small-sample target identification, current small-sample learning has the following limitations: 1) the constraints on the related data are strict; taking transfer learning as an example, it places strict requirements on the source domain, generally requiring the source domain and the target domain to share consistent categories; 2) introducing additional feature information carries risk, since extracting inappropriate additional features may adversely affect the task.
Existing graph convolutional networks (GCNs) fall into two types: spectral methods, which define graph convolution in the spectral domain using the convolution theorem on graphs, and spatial methods, which start from the node domain and aggregate each central node with its neighboring nodes through a defined aggregation function. Graph convolutional networks model non-Euclidean data with a graph structure, use graph convolution operations and a hierarchical network structure to mine the feature information contained in the data, and have shown strong potential in point cloud modeling, scene graph modeling, and other related computer vision tasks.
Disclosure of Invention
To address the shortcomings of the prior art, the invention introduces intermediate-level structural elements on top of traditional small-sample target identification, seeking to break through the limitations of existing small-sample target identification methods; at the same time, a graph convolutional network framework is used to model the structural elements and extract their features. By organically combining small-sample learning with graph convolutional network techniques, a remote sensing image small-sample ship target identification method based on a structure graph convolutional network is provided.
The method of the invention specifically comprises the following steps:
step (1), graph modeling of structural elements
1.1 definition of structural elements
Define the structural elements as: a) the ship contour; b) the key components of the ship's upper deck, comprising the helicopter apron, artillery/missile equipment, antenna/radar equipment, the torpedo launcher, and the stern platform;
1.2 graph modeling of structural elements
Whether the upper-deck key components are easy to detect is taken into account when they are selected; accordingly, the ship edge and the upper-deck key components are detected in an early stage to obtain coordinates (x, y), and a subset of representative coordinates is then selected as feature keypoints to generate the node information of the graph;
Undirected, unweighted edges are selected for modeling. The edge-construction idea is to connect the keypoints of the ship edge in order, building a model of the edge features; the keypoints of the upper-deck key components are then connected in order, building a model of the superstructure features; finally, the bow keypoint is connected to every keypoint of the upper-deck key components. The keypoints serve as the nodes of the graph;
step (2) design of graph network
2.1 graph data construction
For graph data D = {(G1, y1), (G2, y2), ...} with G = (A, X), where G denotes a complete ship graph and y denotes its class label: each graph has N nodes, each with its own features, here the coordinates (x, y). The node features form a matrix X of dimension N × D, and the relationships between the nodes form a matrix A of dimension N × N, called the adjacency matrix. The labels y1, y2, ... form the label set Y corresponding to the graphs, and a mapping function f: G → Y then maps each graph structure to its label;
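As a minimal sketch of this graph-data construction (the keypoint coordinates and edge list below are invented for illustration, not taken from the patent), the pair (A, X) for one ship graph could be assembled as follows:

```python
import numpy as np

def build_graph(coords, edges):
    """Assemble one ship graph: feature matrix X and adjacency matrix A.

    coords: list of N keypoint coordinates (x, y) -> X with shape (N, 2)
    edges:  list of undirected, unweighted edges (i, j) -> A with shape (N, N)
    """
    n = len(coords)
    X = np.asarray(coords, dtype=float)   # node features: the (x, y) coordinates
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0           # undirected edge without weight
    return A, X

# Illustrative 4-keypoint graph (coordinates and connectivity are made up)
A, X = build_graph([(0, 0), (2, 0), (2, 1), (0, 1)],
                   [(0, 1), (1, 2), (2, 3), (3, 0)])
```
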
2.2 sampling neighbor nodes
The representation of each node in a given convolution layer is generated from the previous layer and is independent of the other nodes of the current layer; this is a layer-based sampling scheme. Specifically: at layer 0, the 1-neighborhood of each node is treated as a subset; at layer 1, the representation of each node aggregates the 1-neighborhood information from layer 0; at layer 2, the layer-1 neighborhood information is aggregated again, which expands to the second-order neighbors of layer 0. Aggregating K times therefore expands to the K-order neighbors;
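A sketch of this layer-based aggregation, where one aggregation round reaches one further hop, so K rounds cover the K-order neighborhood (the mean aggregator is an illustrative assumption; the patent does not fix a particular aggregation function):

```python
import numpy as np

def aggregate_k_hops(A, X, K):
    """Aggregate K times so each node absorbs its K-order neighborhood.

    Each round replaces a node's representation with the mean over its
    1-neighborhood (self included) of the previous layer's representations,
    so information propagates exactly one hop per round.
    """
    A_hat = A + np.eye(A.shape[0])        # include the node itself
    deg = A_hat.sum(axis=1, keepdims=True)
    H = X.astype(float)
    for _ in range(K):
        H = (A_hat @ H) / deg             # mean over the 1-neighborhood
    return H

# Path graph 0-1-2: after K=2 rounds, node 2 has absorbed information from node 0
A_path = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
H2 = aggregate_k_hops(A_path, np.eye(3), K=2)
```
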
2.3 Graph convolution operator design
For graph data D, the layer-to-layer propagation rule is:

$$H^{(l+1)} = f\left(H^{(l)}, A\right)$$

namely:

$$H^{(l+1)} = \sigma\left(\tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}} H^{(l)} W^{(l)}\right)$$

wherein: $\tilde{A} = A + I$, with I the identity matrix; $\tilde{D}$ is the degree matrix of $\tilde{A}$, i.e. $\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$; $H^{(l)}$ is the feature matrix of the l-th layer (for the input layer, $H^{(0)} = X$); $W^{(l)}$ is the weight matrix of the l-th layer; and σ is a nonlinear activation function;
The graph network GCN takes a graph as input; through several GCN layers the feature of each node changes from X to Z, but the connection relation between the nodes, i.e. A, is shared no matter how many layers lie in between. A two-layer GCN is constructed, with ReLU and Softmax as the respective activation functions, so the overall forward propagation formula is:

$$Z = f(X, A) = \mathrm{softmax}\left(\hat{A}\, \mathrm{ReLU}\left(\hat{A} X W^{(0)}\right) W^{(1)}\right), \qquad \hat{A} = \tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}}$$
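A minimal NumPy sketch of this two-layer forward pass (the toy graph and weight shapes are illustrative assumptions, not the patented implementation):

```python
import numpy as np

def normalize_adj(A):
    """A_hat = D~^{-1/2} (A + I) D~^{-1/2}, the renormalized adjacency."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def softmax(logits):
    e = np.exp(logits - logits.max(axis=1, keepdims=True))  # numerically stable
    return e / e.sum(axis=1, keepdims=True)

def gcn_forward(A, X, W0, W1):
    """Two-layer GCN: Z = softmax(A_hat · ReLU(A_hat · X · W0) · W1)."""
    A_hat = normalize_adj(A)
    H1 = np.maximum(A_hat @ X @ W0, 0.0)   # first graph convolution + ReLU
    return softmax(A_hat @ H1 @ W1)        # per-node class probabilities Z

rng = np.random.default_rng(0)
A = np.array([[0., 1.], [1., 0.]])         # toy 2-node graph
X = rng.normal(size=(2, 2))                # N x D node features
Z = gcn_forward(A, X, rng.normal(size=(2, 4)), rng.normal(size=(4, 3)))
```
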
step (3) small sample ship identification
The identification network adopts a layered structure: the multilayer neural network computes a representation of each sample, then computes the probability of each category from that representation, and finally computes gradients by backpropagation;
3.1 Obtain the representation of each node in the graph, i.e. the node features Z, through the graph convolution layers;
3.2 Obtain the representation of each graph with a Readout operation; specifically, take the average of all node features of the graph and input it into the classifier;
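The Readout step can be sketched as mean pooling followed by a classifier (the linear softmax classifier here is an illustrative assumption; the patent only specifies averaging the node features before classification):

```python
import numpy as np

def readout_mean(Z):
    """Pool node representations Z (N x F) into one graph vector (F,)."""
    return Z.mean(axis=0)

def classify_graph(Z, W_cls):
    """Feed the pooled graph representation into a softmax classifier."""
    logits = readout_mean(Z) @ W_cls
    e = np.exp(logits - logits.max())      # numerically stable softmax
    return e / e.sum()                     # class probabilities for the graph

rng = np.random.default_rng(1)
probs = classify_graph(rng.normal(size=(5, 8)), rng.normal(size=(8, 3)))
```
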
3.3 Cross-entropy loss is used as the loss function:

$$\mathcal{L} = -\sum_{l \in \mathcal{Y}_L} \sum_{f=1}^{F} Y_{lf} \ln Z_{lf}$$

wherein $\mathcal{Y}_L$ is the label set, $Y_{lf}$ is the ground-truth label category, $Z_{lf}$ is the predicted category probability, F is the number of node features, and l indexes the labeled samples;
Finally, the neural network weights $W^{(l)}$ are trained by gradient descent, and the probability of each category is computed with Softmax.
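The loss and one gradient-descent step can be sketched as follows. For brevity the gradient is shown for a linear softmax classifier on pooled features, using the standard cross-entropy gradient X^T(Z − Y), rather than full backpropagation through the graph convolution layers; the toy data are invented:

```python
import numpy as np

def cross_entropy(Y, Z, eps=1e-12):
    """L = -sum_l sum_f Y_lf * ln Z_lf for one-hot labels Y and predictions Z."""
    return -np.sum(Y * np.log(Z + eps))

def sgd_step(W, X, Y, lr=0.1):
    """One gradient-descent step for a linear softmax classifier."""
    logits = X @ W
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    Z = e / e.sum(axis=1, keepdims=True)          # Softmax probabilities
    grad = X.T @ (Z - Y)                          # cross-entropy gradient
    return W - lr * grad, cross_entropy(Y, Z)

rng = np.random.default_rng(2)
X = rng.normal(size=(6, 4))                       # toy pooled features
Y = np.eye(3)[rng.integers(0, 3, size=6)]         # one-hot toy labels
W = np.zeros((4, 3))
losses = []
for _ in range(50):
    W, loss = sgd_step(W, X, Y)
    losses.append(loss)
```
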
Compared with the prior art, the invention has the following effects: intermediate-level structural elements are introduced on top of traditional small-sample target identification, and a graph convolutional network framework is used to model the structural elements and extract their features, yielding a remote sensing image small-sample ship target identification method based on a structure graph convolutional network. By raising traditional target identification from the pixel level to the component level, the method breaks through the limitations of existing small-sample target identification frameworks and improves both the accuracy and the speed of small-sample target identification.
Drawings
FIG. 1 is an overall flow chart of the present invention;
FIG. 2 is a ship map model;
FIG. 3 is a schematic diagram of neighbor-node sampling.
Detailed Description
As shown in fig. 1, the remote sensing image small sample ship target identification method (SGCN) based on the Structure Graph Convolutional Network of the present invention includes the following specific steps:
step (1), graph modeling of structural elements
1.1 Definition of structural elements
Comprehensively considering the requirements of universality, distinctiveness, and saliency, the following structural elements are defined: a) the ship contour; b) the key components of the ship's upper deck.
Taking the contour as an example: a ship is a rigid object, and under a remote sensing overhead view in particular, the contour of a given ship is relatively stable, satisfying the universality requirement; ships of different types and classes differ in their local contours, satisfying the distinctiveness requirement; and the ship contour lies on the boundary between ship and water, giving it salient characteristics in the image representation.
The specific definition of the upper-deck key components requires human prior knowledge. Through structural analysis of ships, the defined key components include: the helicopter apron, artillery/missile equipment, antenna/radar equipment, torpedo launchers, stern platforms, and the like.
1.2 Graph modeling of structural elements
Whether the key components of the ship are easy to detect is taken into account when they are selected; accordingly, the ship edge and the key superstructure parts are detected in an early stage to obtain coordinates (x, y), and a subset of representative coordinates is then selected as feature keypoints to generate the node information of the graph.
A graph network may have several kinds of edges: undirected edges, directed edges, and weighted undirected or directed edges. After analyzing the characteristics of the ships under study and the problem to be solved, the invention selects undirected, unweighted edges for modeling. The edge-construction idea is to connect the keypoints of the ship edge in order, building a model of the edge features. The keypoints of the superstructure are then connected in order, building a model of the superstructure features. Finally, the bow keypoint is connected to every keypoint of the superstructure; this step merges the ship edge with the superstructure keypoints to generate a complete graph, shown in figure 2.
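The edge-construction order described above could be sketched as follows (the node indexing is an illustrative assumption: contour keypoints come first, then superstructure keypoints, with the bow as node 0):

```python
def ship_edges(n_contour, superstructure, bow=0):
    """Edge list for the ship graph model.

    1) connect the contour keypoints 0..n_contour-1 in order (closed outline),
    2) connect the superstructure keypoints in order,
    3) connect the bow keypoint to every superstructure keypoint,
       merging outline and superstructure into one complete graph.
    """
    edges = [(i, (i + 1) % n_contour) for i in range(n_contour)]
    edges += list(zip(superstructure, superstructure[1:]))
    edges += [(bow, k) for k in superstructure]
    return edges

# Illustrative: 6 contour keypoints, 3 superstructure keypoints (nodes 6-8)
edges = ship_edges(6, [6, 7, 8])
```
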
Step (2) design of graph network
For graph data D = {(G1, y1), (G2, y2), ...} with G = (A, X): each graph has N nodes, each with its own features, here the coordinates (x, y). The node features form a matrix X of dimension N × D, and the relationships between the nodes form a matrix A of dimension N × N, called the adjacency matrix. The labels y1, y2, etc. form the label set Y corresponding to the graphs, and a mapping function f: G → Y then maps each graph structure to its label.
2.1 sampling neighbor nodes
As shown in fig. 3, the representation of each node in a given layer is generated from the previous layer and is independent of the other nodes of that layer; this is a layer-based sampling scheme. Specifically: at layer 0, the 1-neighborhood of each node is treated as a subset; at layer 1, the representation of each node aggregates the 1-neighborhood information from layer 0; at layer 2, the layer-1 neighborhood information is aggregated again, which expands to the second-order neighbors of layer 0. Aggregating K times therefore expands to the K-order neighbors.
2.2 graph convolution operator design
For graph data D there are N nodes, each with its own features; the node features form an N × D matrix X, and the relationships between the nodes form an N × N matrix A, also called the adjacency matrix. X and A are the inputs of the model. The layer-to-layer propagation rule is:
$$H^{(l+1)} = f\left(H^{(l)}, A\right)$$

namely:

$$H^{(l+1)} = \sigma\left(\tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}} H^{(l)} W^{(l)}\right)$$

wherein: $\tilde{A} = A + I$, with I the identity matrix; $\tilde{D}$ is the degree matrix of $\tilde{A}$, i.e. $\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$; H is the feature matrix of each layer (for the input layer, H = X); W is the weight matrix; and σ is a nonlinear activation function.
The GCN takes a graph as input; through several GCN layers the features of each node change from X to Z, but the connection relation between the nodes, i.e. A, is shared no matter how many layers lie in between. We construct a two-layer GCN with ReLU and Softmax as the respective activation functions, so the overall forward propagation formula is:

$$Z = f(X, A) = \mathrm{softmax}\left(\hat{A}\, \mathrm{ReLU}\left(\hat{A} X W^{(0)}\right) W^{(1)}\right), \qquad \hat{A} = \tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}}$$
step (3) small sample ship identification
The identification network of the invention adopts a layered structure: the multilayer neural network computes a representation of each sample, then computes the probability of each class from that representation, and finally computes gradients by backpropagation.
3.1 Obtain a representation of each node in the graph, i.e. the node features Z, through the graph convolution layers.
3.2 Use the Readout operation to obtain a representation of each graph, by taking the average of all the node features of the graph and then inputting it into the classifier.
3.3 Cross-entropy loss is used as the loss function:

$$\mathcal{L} = -\sum_{l \in \mathcal{Y}_L} \sum_{f=1}^{F} Y_{lf} \ln Z_{lf}$$

wherein $\mathcal{Y}_L$ is the label set, $Y_{lf}$ is the ground-truth label category, $Z_{lf}$ is the predicted category probability, F is the number of node features, and l indexes the labeled samples.
Finally, the neural network weights $W^{(l)}$ are trained by gradient descent, and the probability of each category is computed with Softmax.

Claims (1)

1. A remote sensing image small-sample ship target identification method based on a structure graph convolutional network, characterized by comprising the following steps:
step (1), graph modeling of structural elements
1.1 definition of structural elements
Define the structural elements as: a) the ship contour; b) the key components of the ship's upper deck, comprising the helicopter apron, artillery/missile equipment, antenna/radar equipment, the torpedo launcher, and the stern platform;
1.2 graph modeling of structural elements
Whether the upper-deck key components are easy to detect is taken into account when they are selected; accordingly, the ship edge and the upper-deck key components are detected in an early stage to obtain coordinates (x, y), and a subset of representative coordinates is then selected as feature keypoints to generate the node information of the graph;
Undirected, unweighted edges are selected for modeling. The edge-construction idea is to connect the keypoints of the ship edge in order, building a model of the edge features; the keypoints of the upper-deck key components are then connected in order, building a model of the superstructure features; finally, the bow keypoint is connected to every keypoint of the upper-deck key components. The keypoints serve as the nodes of the graph;
step (2) design of graph network
2.1 graph data construction
For graph data D = {(G1, y1), (G2, y2), ...} with G = (A, X), where G denotes a complete ship graph and y denotes its class label: each graph has N nodes, each with its own features, here the coordinates (x, y). The node features form a matrix X of dimension N × D, and the relationships between the nodes form a matrix A of dimension N × N, called the adjacency matrix. The labels y1, y2, ... form the label set Y corresponding to the graphs, and a mapping function f: G → Y then maps each graph structure to its label;
2.2 sampling neighbor nodes
The representation of each node in a given convolution layer is generated from the previous layer and is independent of the other nodes of the current layer; this is a layer-based sampling scheme. Specifically: at layer 0, the 1-neighborhood of each node is treated as a subset; at layer 1, the representation of each node aggregates the 1-neighborhood information from layer 0; at layer 2, the layer-1 neighborhood information is aggregated again, which expands to the second-order neighbors of layer 0. Aggregating K times therefore expands to the K-order neighbors;
2.3 Graph convolution operator design
For graph data D, the layer-to-layer propagation rule is:

$$H^{(l+1)} = f\left(H^{(l)}, A\right)$$

namely:

$$H^{(l+1)} = \sigma\left(\tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}} H^{(l)} W^{(l)}\right)$$

wherein: $\tilde{A} = A + I$, with I the identity matrix; $\tilde{D}$ is the degree matrix of $\tilde{A}$, i.e. $\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$; $H^{(l)}$ is the feature matrix of the l-th layer (for the input layer, $H^{(0)} = X$); $W^{(l)}$ is the weight matrix of the l-th layer; and σ is a nonlinear activation function;
The graph network GCN takes a graph as input; through several GCN layers the feature of each node changes from X to Z, but the connection relation between the nodes, i.e. A, is shared no matter how many layers lie in between. A two-layer GCN is constructed, with ReLU and Softmax as the respective activation functions, so the overall forward propagation formula is:

$$Z = f(X, A) = \mathrm{softmax}\left(\hat{A}\, \mathrm{ReLU}\left(\hat{A} X W^{(0)}\right) W^{(1)}\right), \qquad \hat{A} = \tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}}$$
step (3) small sample ship identification
The identification network adopts a layered structure: the multilayer neural network computes a representation of each sample, then computes the probability of each category from that representation, and finally computes gradients by backpropagation;
3.1 Obtain the representation of each node in the graph, i.e. the node features Z, through the graph convolution layers;
3.2 Obtain the representation of each graph with a Readout operation; specifically, take the average of all node features of the graph and input it into the classifier;
3.3 Cross-entropy loss is used as the loss function:

$$\mathcal{L} = -\sum_{l \in \mathcal{Y}_L} \sum_{f=1}^{F} Y_{lf} \ln Z_{lf}$$

wherein $\mathcal{Y}_L$ is the label set, $Y_{lf}$ is the ground-truth label category, $Z_{lf}$ is the predicted category probability, F is the number of node features, and l indexes the labeled samples;
finally, training the weight W of the neural network by a gradient descent method(l)The probability for each category is calculated using Softmax.
CN202011017557.8A 2020-09-24 2020-09-24 Remote sensing image small sample ship target identification method based on structural diagram convolutional network Active CN112163504B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011017557.8A CN112163504B (en) 2020-09-24 2020-09-24 Remote sensing image small sample ship target identification method based on structural diagram convolutional network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011017557.8A CN112163504B (en) 2020-09-24 2020-09-24 Remote sensing image small sample ship target identification method based on structural diagram convolutional network

Publications (2)

Publication Number Publication Date
CN112163504A true CN112163504A (en) 2021-01-01
CN112163504B CN112163504B (en) 2024-02-20

Family

ID=73863845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011017557.8A Active CN112163504B (en) 2020-09-24 2020-09-24 Remote sensing image small sample ship target identification method based on structural diagram convolutional network

Country Status (1)

Country Link
CN (1) CN112163504B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113269647A (en) * 2021-06-08 2021-08-17 上海交通大学 Graph-based transaction abnormity associated user detection method
CN116721301A (en) * 2023-08-10 2023-09-08 中国地质大学(武汉) Training method, classifying method, device and storage medium for target scene classifying model

Citations (2)

Publication number Priority date Publication date Assignee Title
CN109359557A (en) * 2018-09-25 2019-02-19 东北大学 A kind of SAR remote sensing images Ship Detection based on transfer learning
US20200285944A1 (en) * 2019-03-08 2020-09-10 Adobe Inc. Graph convolutional networks with motif-based attention

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN109359557A (en) * 2018-09-25 2019-02-19 东北大学 A kind of SAR remote sensing images Ship Detection based on transfer learning
US20200285944A1 (en) * 2019-03-08 2020-09-10 Adobe Inc. Graph convolutional networks with motif-based attention

Non-Patent Citations (1)

Title
刘晨; 曲长文; 周强; 李智; 李健伟: "SAR image target classification based on convolutional neural network transfer learning" (基于卷积神经网络迁移学习的SAR图像目标分类), Modern Radar (现代雷达), no. 03, 15 March 2018 (2018-03-15) *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN113269647A (en) * 2021-06-08 2021-08-17 上海交通大学 Graph-based transaction abnormity associated user detection method
CN116721301A (en) * 2023-08-10 2023-09-08 中国地质大学(武汉) Training method, classifying method, device and storage medium for target scene classifying model
CN116721301B (en) * 2023-08-10 2023-10-24 中国地质大学(武汉) Training method, classifying method, device and storage medium for target scene classifying model

Also Published As

Publication number Publication date
CN112163504B (en) 2024-02-20

Similar Documents

Publication Publication Date Title
US11830246B2 (en) Systems and methods for extracting and vectorizing features of satellite imagery
CN107871119B (en) Target detection method based on target space knowledge and two-stage prediction learning
CN110929697B (en) Neural network target identification method and system based on residual error structure
CN110929607B (en) Remote sensing identification method and system for urban building construction progress
CN107092870B (en) A kind of high resolution image Semantic features extraction method
CN107194336B (en) Polarized SAR image classification method based on semi-supervised depth distance measurement network
CN104392228B (en) Unmanned plane image object class detection method based on conditional random field models
CN109584248A (en) Infrared surface object instance dividing method based on Fusion Features and dense connection network
CN109886286A (en) Object detection method, target detection model and system based on cascade detectors
CN112132818B (en) Pulmonary nodule detection and clinical analysis method constructed based on graph convolution neural network
CN107818302A (en) Non-rigid multiple dimensioned object detecting method based on convolutional neural networks
CN108564029A (en) Face character recognition methods based on cascade multi-task learning deep neural network
CN107909015A (en) Hyperspectral image classification method based on convolutional neural networks and empty spectrum information fusion
CN109978918A (en) A kind of trajectory track method, apparatus and storage medium
CN109919085B (en) Human-human interaction behavior identification method based on light-weight convolutional neural network
CN112381060B (en) Building earthquake damage level classification method based on deep learning
CN107016413A (en) A kind of online stage division of tobacco leaf based on deep learning algorithm
CN112488025B (en) Double-temporal remote sensing image semantic change detection method based on multi-modal feature fusion
CN111738268A (en) Semantic segmentation method and system for high-resolution remote sensing image based on random block
CN112163504A (en) Remote sensing image small sample ship target identification method based on structure chart convolutional network
CN109919246A (en) Pedestrian's recognition methods again based on self-adaptive features cluster and multiple risks fusion
CN111242227A (en) Multi-modal foundation cloud identification method based on heterogeneous depth features
CN114330516A (en) Small sample logo image classification based on multi-graph guided neural network model
CN107766810B (en) Cloud and shadow detection method
CN115512247A (en) Regional building damage grade assessment method based on image multi-parameter extraction

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant