CN113837046A - Small sample remote sensing image scene classification method based on iterative feature distribution learning - Google Patents

Small sample remote sensing image scene classification method based on iterative feature distribution learning

Info

Publication number
CN113837046A
Authority
CN
China
Prior art keywords
feature
learning
matrix
small sample
classification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111089674.XA
Other languages
Chinese (zh)
Inventor
耿杰 (Geng Jie)
曾庆捷 (Zeng Qingjie)
蒋雯 (Jiang Wen)
邓鑫洋 (Deng Xinyang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202111089674.XA
Publication of CN113837046A
Legal status: Pending (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/24: Classification techniques
    • G06F 18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2415: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a small sample remote sensing image scene classification method based on iterative feature distribution learning, which comprises the following steps: step one, setting up an iterative feature distribution learning network framework; step two, generating a similarity matrix between image samples; step three, training the small sample classifier with guiding knowledge based on the classification prediction probability and the similarity matrix; and step four, correcting the feature distribution using the prediction probability distribution matrix and the attention weight, and then iteratively updating the whole network. The invention has a simple structure and a reasonable design: it first generates a similarity matrix to obtain the feature correlation among samples, then trains the classifier under the guidance of historical classification predictions, and finally corrects the features with an attention mechanism, feeding the corrected features back into the network to update the whole network iteratively. Through iterative feature distribution learning, the method further improves the representation capability of each category and thereby improves the accuracy of small sample classification.

Description

Small sample remote sensing image scene classification method based on iterative feature distribution learning
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a small sample remote sensing image scene classification method based on iterative feature distribution learning.
Background
With the development of artificial intelligence technology, deep learning can achieve satisfactory results on computer vision tasks in many scenarios by relying on deep, complex network models, massive training data, and strong hardware support. However, for few-shot (or even one-shot) learning tasks, deep learning cannot improve its learning ability through more complex network models or larger amounts of training data. Therefore, small sample learning techniques that learn data patterns from only a few (or single) samples are currently an intensively researched direction in the field of deep learning.
Small sample learning aims to achieve the ability to recognize unknown classes from only a few training samples, which is similar to how the human brain associates and reasons about unknown things based on prior knowledge. The core of small sample learning is to classify unlabeled samples correctly when only a few labeled samples are available.
The small sample remote sensing image scene classification technology based on deep learning can achieve high-accuracy classification of unlabeled images when labeled data are limited. Therefore, it has broad application prospects in earth environment monitoring, ground target classification, rare animal classification, and other applications.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a small sample remote sensing image scene classification method based on iterative feature distribution learning that is simple in structure and reasonable in design, aiming at the problem that a deep network model easily over-fits and classifies poorly when the number of samples is insufficient. The overall design idea is as follows: first, generate a similarity matrix to obtain the feature correlation among samples; second, train a classifier under the guidance of historical classification predictions; finally, correct the features with an attention mechanism and feed the corrected features back into the network to update the whole network iteratively. Through iterative feature distribution learning, the representation capability of each category can be further improved, thereby improving the accuracy of small sample classification.
In order to solve the above technical problem, the invention adopts the following technical scheme: a small sample remote sensing image scene classification method based on iterative feature distribution learning, characterized in that it comprises the following steps:
step one, setting an iterative feature distribution learning network framework:
Step 101, setting up the network framework, which includes a feature extractor F, a relation metric model f_R, a classifier f_L, an attention model f_A, and a feature learning model f_C;
Step 102, setting the learning rate and the iteration times of a network;
step two, generating a similarity matrix between the image samples:
Step 201, inputting an image set X composed of a support set and a query set into the feature extractor F to obtain features F^k(X);
Step 202, expanding the features F^k(X) in the first dimension to obtain features F_i^k(X);
Step 203, exchanging the first and second dimensions of F_i^k(X) to obtain the transposed features;
Step 204, calculating the relation matrix and the similarity matrix between samples;
Step three, training the small sample classifier by using guiding knowledge based on the classification prediction probability and the similarity matrix:
Step 301, inputting the features F^k(X) into the classifier to obtain the classification prediction probability;
Step 302, stacking the classification prediction probability with the similarity matrix of the next iteration in the second dimension as guiding knowledge;
Step 303, inputting the stacking result into the classifier and optimizing the classifier with the focal loss L_FL;
Step four, correcting the feature distribution by using the prediction probability distribution matrix and the attention weight, and then iteratively updating the whole network:
Step 401, expanding the classification prediction probability in the first dimension to obtain a probability matrix P_i^k(X);
Step 402, exchanging the first and second dimensions of P_i^k(X) to obtain the transposed probability matrix;
Step 403, calculating the prediction probability distribution matrix and the attention weight, thereby obtaining the corrected features F^{k+1}(X);
Step 404, repeating the process from step 202 in step two and obtaining the final classification result after the specified number of iterations (a sketch of this iterative loop is given below).
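To make the structure of steps one to four concrete, the following is a minimal sketch of the iterative loop, stated under assumptions: PyTorch is assumed as the framework; the callables extractor, relation_net, classifier, attention_net and feature_net stand in for F, f_R, f_L, f_A and f_C; and the softmax normalization, the concatenation axis and the element-wise weighting are illustrative choices rather than the exact formulas of the invention.

```python
# Hedged sketch of the iterative feature distribution learning loop.
# All identifiers below are illustrative assumptions, not names from the patent.
import torch

def run_episode(x, extractor, relation_net, classifier, attention_net,
                feature_net, p_init, num_iters):
    feats = extractor(x)                        # step 201: features of support + query images
    probs = p_init                              # initialized prediction probabilities
    for k in range(num_iters):                  # step 404: iterate the whole pipeline
        f_i = feats.unsqueeze(0)                # step 202: expand the first dimension
        f_j = f_i.transpose(0, 1)               # step 203: swap the first two dimensions
        rel = relation_net(f_i, f_j)            # step 204: relation matrix via f_R
        sim = torch.softmax(rel, dim=-1)        # step 204: similarity matrix (softmax assumed)
        guide = torch.cat([probs, sim], dim=1)  # step 302: stack predictions with similarities
        probs = classifier(guide)               # steps 301/303: classification prediction
        attn = attention_net(probs)             # step 403: attention weight via f_A
        feats = feature_net(attn * feats)       # step 403: corrected features via f_C
    return probs
```

The focal-loss optimization of step 303 would be applied to the predictions of the labelled support samples inside this loop.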
The small sample remote sensing image scene classification method based on iterative feature distribution learning is characterized in that: the relation matrix in step 204 is computed by applying the relation metric function f_R[·] to the expanded features F_i^k(X) and their transposed counterpart, and the similarity matrix is then computed from the relation matrix, where k denotes the k-th iteration.
The small sample remote sensing image scene classification method based on iterative feature distribution learning is characterized in that: the classification prediction probability in step 301 is computed with the classifier function f_L[·], where Cat(·) denotes stacking along the channel dimension, OneHot(·) denotes one-hot encoding, y is the true label corresponding to each support set sample, and P_init is the initialized query set sample probability score.
The small sample remote sensing image scene classification method based on iterative feature distribution learning is characterized in that: the focal loss L_FL in step 303 is computed from the classification prediction probability and the true sample label y, where α is a parameter that adjusts the loss weight and γ is an adjustment factor that controls the rate at which the sample weight decays.
The small sample remote sensing image scene classification method based on iterative feature distribution learning described above is characterized in that: the prediction probability distribution matrix in step 403 is computed from the expanded probability matrices, the attention weight is obtained from it with the attention model function f_A[·], and the corrected features F^{k+1}(X) are obtained with the feature learning model function f_C{·}.
The small sample remote sensing image scene classification method based on iterative feature distribution learning is characterized in that: the network structure of the relation metric model f_R is: input layer → first convolution layer → second convolution layer → third convolution layer; the network structure of the classifier f_L is: input layer → first fully connected layer → second fully connected layer; the network structure of the attention model f_A is: input layer → first convolution layer → second convolution layer → third convolution layer; the network structure of the feature learning model f_C is: input layer → first convolution layer → second convolution layer.
The small sample remote sensing image scene classification method based on iterative feature distribution learning is characterized in that: each convolution layer is a basic convolution unit of one of the neural networks VGG, ResNet, GoogLeNet, or AlexNet.
Compared with the prior art, the invention has the following advantages:
1. The method adopts a feature distribution learning scheme: it first obtains the feature correlation among samples, then optimizes the classifier, and finally corrects the sample features with an attention mechanism. This strengthens the effective information of the image targets, enhances the representation capability of each category, and further improves the accuracy of small sample classification.
2. The invention has a simple structure and reasonable design. It adopts a loop-iteration model architecture that keeps refining the network parameters as the features are optimized, which effectively improves the convergence speed of the model.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The method of the present invention will be described in further detail below with reference to the accompanying drawings and embodiments of the invention.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Spatially relative terms, such as "above", "over", "on top of", and the like, may be used herein for ease of description to describe one device's or feature's spatial relationship to another device or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" the other devices or configurations. Thus, the exemplary term "above" can encompass both an orientation of "above" and "below". The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
As shown in fig. 1, the method for classifying scenes of small sample remote sensing images based on iterative feature distribution learning of the present invention includes the following steps:
step one, setting an iterative feature distribution learning network framework:
Step 101, setting up the network framework, which includes a feature extractor F, a relation metric model f_R, a classifier f_L, an attention model f_A, and a feature learning model f_C;
Step 102, setting the learning rate of the network to be 0.001 and the iteration number to be 1000;
In the small sample remote sensing image scene classification method based on iterative feature distribution learning, it should be noted that the network structure of the relation metric model f_R is, in order: input layer → first convolution layer → second convolution layer → third convolution layer; the network structure of the classifier f_L is: input layer → first fully connected layer → second fully connected layer; the network structure of the attention model f_A is: input layer → first convolution layer → second convolution layer → third convolution layer; the network structure of the feature learning model f_C is: input layer → first convolution layer → second convolution layer.
In the small sample remote sensing image scene classification method based on iterative feature distribution learning, it should also be noted that each convolution layer is a basic convolution unit of one of the neural networks VGG, ResNet, GoogLeNet, or AlexNet.
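For reference, the following is a minimal sketch of the four sub-networks described above. PyTorch, the class names, the channel and kernel sizes, and the BatchNorm/ReLU inside the basic convolution unit are assumptions; only the counts of convolution and fully connected layers follow the text.

```python
# Minimal PyTorch sketch of the four sub-networks. Class names, channel sizes,
# kernel sizes, and the BatchNorm/ReLU inside the basic convolution unit are
# assumptions; only the layer counts follow the description.
import torch.nn as nn

def conv_unit(in_ch, out_ch):
    # A VGG-style basic convolution unit (one of the options named above).
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class RelationMetricModel(nn.Module):        # f_R: three convolution layers
    def __init__(self, in_ch=256, hidden=64):
        super().__init__()
        self.net = nn.Sequential(conv_unit(in_ch, hidden),
                                 conv_unit(hidden, hidden),
                                 conv_unit(hidden, 1))

    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):                 # f_L: two fully connected layers
    def __init__(self, in_dim, hidden=128, n_classes=5):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden),
                                 nn.ReLU(inplace=True),
                                 nn.Linear(hidden, n_classes))

    def forward(self, x):
        return self.net(x)

class AttentionModel(nn.Module):             # f_A: three convolution layers
    def __init__(self, in_ch=1, hidden=64, out_ch=128):
        super().__init__()
        self.net = nn.Sequential(conv_unit(in_ch, hidden),
                                 conv_unit(hidden, hidden),
                                 conv_unit(hidden, out_ch))

    def forward(self, x):
        return self.net(x)

class FeatureLearningModel(nn.Module):       # f_C: two convolution layers
    def __init__(self, ch=128):
        super().__init__()
        self.net = nn.Sequential(conv_unit(ch, ch), conv_unit(ch, ch))

    def forward(self, x):
        return self.net(x)
```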
Step two, generating a similarity matrix between the image samples:
Step 201, inputting an image set X composed of a support set and a query set into the feature extractor F to obtain features F^k(X); for each batch of tasks, the dimension of the image set X is (10, 3, 84, 84), and after the feature extractor F the data dimension becomes (10, 128, 5, 5), i.e., F^k(X) has dimension (10, 128, 5, 5);
Step 202, expanding the features F^k(X) in the first dimension to obtain features F_i^k(X) with dimension (1, 10, 128, 5, 5);
Step 203, exchanging the first and second dimensions of F_i^k(X) to obtain the transposed features with dimension (10, 1, 128, 5, 5);
Step 204, calculating the relation matrix and the similarity matrix: the relation matrix of the k-th iteration is computed by applying the relation metric function f_R[·] to the expanded features and their transposed counterpart, and the similarity matrix is then computed from the relation matrix, where k denotes the k-th iteration; a sketch of this step is given below.
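The following sketch illustrates steps 202 to 204 with the dimensions given above. The pairwise concatenation of the expanded and transposed features, the pooling of the relation scores, and the softmax normalization of the similarity matrix are illustrative assumptions; relation_net stands in for f_R.

```python
import torch

def similarity_matrix(feats, relation_net):
    # feats: (N, C, H, W), e.g. (10, 128, 5, 5) as in step 201.
    n = feats.size(0)
    f_i = feats.unsqueeze(0)                              # step 202: (1, N, C, H, W)
    f_j = f_i.transpose(0, 1)                             # step 203: (N, 1, C, H, W)
    pairs = torch.cat([f_i.expand(n, -1, -1, -1, -1),     # pair every sample with
                       f_j.expand(-1, n, -1, -1, -1)],    # every other sample
                      dim=2)                              # (N, N, 2C, H, W)
    scores = relation_net(pairs.flatten(0, 1))            # relation score per ordered pair
    rel = scores.reshape(n, n, -1).mean(dim=-1)           # step 204: N x N relation matrix (pooling assumed)
    sim = torch.softmax(rel, dim=-1)                      # step 204: similarity matrix (softmax assumed)
    return rel, sim
```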
Step three, training the small sample classifier by using guiding knowledge based on the classification prediction probability and the similarity matrix:
Step 301, inputting the features F^k(X) into the classifier to obtain the classification prediction probability, which is computed with the classifier function f_L[·]; here Cat(·) denotes stacking along the channel dimension, OneHot(·) denotes one-hot encoding, y is the true label corresponding to each support set sample, and P_init is the initialized query set sample probability score;
Step 302, stacking the classification prediction probability with the similarity matrix of the next iteration in the second dimension as guiding knowledge;
Step 303, inputting the stacking result into the classifier and optimizing the classifier with the focal loss L_FL, which is computed from the classification prediction probability and the true sample label y; α is a parameter that adjusts the loss weight and γ is an adjustment factor that controls the rate at which the sample weight decays; a sketch of this loss is given below.
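In its conventional form, the focal loss named in step 303 is a cross-entropy term down-weighted by a factor (1 - p)^γ and scaled by α. The sketch below uses that conventional form with assumed default values for α and γ and assumes the loss is evaluated on the labelled support samples; it is not asserted to be the exact formula of the patent.

```python
import torch

def focal_loss(probs, targets, alpha=0.25, gamma=2.0, eps=1e-8):
    # probs: class probabilities of shape (N, num_classes); targets: integer
    # labels of shape (N,). alpha weights the loss, gamma controls how fast the
    # weight of well-classified samples decays (default values are assumptions).
    p_t = probs.gather(1, targets.unsqueeze(1)).squeeze(1).clamp(min=eps)
    return (-alpha * (1.0 - p_t) ** gamma * torch.log(p_t)).mean()
```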
Step four, correcting the characteristic distribution by using the prediction probability distribution matrix and the attention weight, and then iteratively updating the whole network:
Step 401, expanding the classification prediction probability, whose dimension is (10, 5), in the first dimension to obtain a probability matrix P_i^k(X) with dimension (1, 10, 5);
Step 402, exchanging the first and second dimensions of P_i^k(X) to obtain the transposed probability matrix with dimension (10, 1, 5);
Step 403, calculating the prediction probability distribution matrix and the attention weight, thereby obtaining the corrected features F^{k+1}(X): the prediction probability distribution matrix is computed from the expanded probability matrices, the attention weight is obtained from it with the attention model function f_A[·], and the corrected features F^{k+1}(X) are obtained with the feature learning model function f_C{·} (a sketch of this correction step is given below);
Step 404, repeating the process from step 202 in step two and obtaining the final classification result after the specified number of iterations.
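The following sketch illustrates the feature correction of steps 401 to 403. The pairwise dot product used to build the prediction probability distribution matrix, the sigmoid gating, and the reduction of the attention output to one weight per sample are illustrative assumptions; attention_net and feature_net stand in for f_A and f_C.

```python
import torch

def correct_features(feats, probs, attention_net, feature_net):
    # feats: (N, C, H, W) features F^k(X); probs: (N, num_classes) predictions.
    n = probs.size(0)
    p_i = probs.unsqueeze(0)                              # step 401: (1, N, num_classes)
    p_j = p_i.transpose(0, 1)                             # step 402: (N, 1, num_classes)
    dist = (p_i * p_j).sum(dim=-1)                        # step 403: N x N distribution matrix (dot product assumed)
    attn = attention_net(dist.unsqueeze(0).unsqueeze(0))  # attention weights via f_A
    weight = torch.sigmoid(attn).mean(dim=(0, 1, 2))      # one weight per sample (reduction assumed)
    return feature_net(feats * weight.view(n, 1, 1, 1))   # corrected features F^{k+1}(X) via f_C
```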
The technical effects of the invention are explained by simulation experiments as follows:
1. simulation conditions and content
The experimental data of the invention is the remote sensing image data set NWPU-RESISC45 collected by Northwestern Polytechnical University, with an image size of 256 × 256, 45 different scene classes, and 31,500 images in total. Table 1 compares the classification accuracy with existing small sample classification techniques. In the simulation experiments, both the proposed method and the comparison methods are implemented in Python 3.7.
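For context, a 5-way 1-shot task with one query image per class yields the batch of 10 images used in the embodiment above. The sketch below shows one way such an episode could be sampled from an NWPU-RESISC45-style collection of class folders; the class_to_images mapping and the query size per class are assumptions.

```python
import random

def sample_episode(class_to_images, n_way=5, k_shot=1, q_query=1):
    # class_to_images: dict mapping a scene class name to a list of image paths
    # (an assumed representation of the NWPU-RESISC45 folder layout).
    classes = random.sample(sorted(class_to_images), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        picks = random.sample(class_to_images[cls], k_shot + q_query)
        support += [(path, label) for path in picks[:k_shot]]
        query += [(path, label) for path in picks[k_shot:]]
    return support, query
```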
2. Analysis of simulation results
Table 1: Small sample classification accuracy comparison
As can be seen from Table 1, compared with existing metric-based and gradient-based models, the proposed method achieves higher classification accuracy, which demonstrates its effectiveness for small sample remote sensing image scene classification. According to the experimental results under the standard 5-way 1-shot and 5-way 5-shot settings, the disclosed method can effectively improve the classification of small sample remote sensing images.
The above embodiments are only examples of the present invention, and are not intended to limit the present invention, and all simple modifications, changes and equivalent structural changes made to the above embodiments according to the technical spirit of the present invention still fall within the protection scope of the technical solution of the present invention.

Claims (7)

1. A small sample remote sensing image scene classification method based on iterative feature distribution learning, characterized in that it comprises the following steps:
step one, setting an iterative feature distribution learning network framework:
Step 101, setting up the network framework, which includes a feature extractor F, a relation metric model f_R, a classifier f_L, an attention model f_A, and a feature learning model f_C;
Step 102, setting the learning rate and the iteration times of a network;
step two, generating a similarity matrix between the image samples:
Step 201, inputting an image set X composed of a support set and a query set into the feature extractor F to obtain features F^k(X);
Step 202, expanding the features F^k(X) in the first dimension to obtain features F_i^k(X);
Step 203, exchanging the first and second dimensions of F_i^k(X) to obtain the transposed features;
Step 204, calculating the relation matrix and the similarity matrix between samples;
Step three, training the small sample classifier by using guiding knowledge based on the classification prediction probability and the similarity matrix:
Step 301, inputting the features F^k(X) into the classifier to obtain the classification prediction probability;
Step 302, stacking the classification prediction probability with the similarity matrix of the next iteration in the second dimension as guiding knowledge;
Step 303, inputting the stacking result into the classifier and optimizing the classifier with the focal loss L_FL;
Step four, correcting the feature distribution by using the prediction probability distribution matrix and the attention weight, and then iteratively updating the whole network:
Step 401, expanding the classification prediction probability in the first dimension to obtain a probability matrix P_i^k(X);
Step 402, exchanging the first and second dimensions of P_i^k(X) to obtain the transposed probability matrix;
Step 403, calculating the prediction probability distribution matrix and the attention weight, thereby obtaining the corrected features F^{k+1}(X);
Step 404, repeating the process from step 202 in step two and obtaining the final classification result after the specified number of iterations.
2. The method for classifying scenes of small sample remote sensing images based on iterative feature distribution learning according to claim 1, characterized in that: the relation matrix in step 204 is computed by applying the relation metric function f_R[·] to the expanded features F_i^k(X) and their transposed counterpart, and the similarity matrix is then computed from the relation matrix, where k denotes the k-th iteration.
3. The method for classifying scenes of small sample remote sensing images based on iterative feature distribution learning according to claim 1, characterized in that: the classification prediction probability in step 301 is computed with the classifier function f_L[·], where Cat(·) denotes stacking along the channel dimension, OneHot(·) denotes one-hot encoding, y is the true label corresponding to each support set sample, and P_init is the initialized query set sample probability score.
4. The method for classifying scenes of small sample remote sensing images based on iterative feature distribution learning according to claim 1, characterized in that: the focal loss L_FL in step 303 is computed from the classification prediction probability and the true sample label y, where α is a parameter that adjusts the loss weight and γ is an adjustment factor that controls the rate at which the sample weight decays.
5. The method for classifying scenes of small sample remote sensing images based on iterative feature distribution learning according to claim 1, characterized in that: the prediction probability distribution matrix in step 403 is computed from the expanded probability matrices, the attention weight is obtained from it with the attention model function f_A[·], and the corrected features F^{k+1}(X) are obtained with the feature learning model function f_C{·}.
6. The method for classifying scenes of small sample remote sensing images based on iterative feature distribution learning according to claim 1, characterized in that: the network structure of the relation metric model f_R is: input layer → first convolution layer → second convolution layer → third convolution layer; the network structure of the classifier f_L is: input layer → first fully connected layer → second fully connected layer; the network structure of the attention model f_A is: input layer → first convolution layer → second convolution layer → third convolution layer; the network structure of the feature learning model f_C is: input layer → first convolution layer → second convolution layer.
7. The small sample remote sensing image scene classification method based on iterative feature distribution learning according to claim 6, characterized in that: each convolution layer is a basic convolution unit of one of the neural networks VGG, ResNet, GoogLeNet, or AlexNet.
CN202111089674.XA 2021-09-17 2021-09-17 Small sample remote sensing image scene classification method based on iterative feature distribution learning Pending CN113837046A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111089674.XA CN113837046A (en) 2021-09-17 2021-09-17 Small sample remote sensing image scene classification method based on iterative feature distribution learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111089674.XA CN113837046A (en) 2021-09-17 2021-09-17 Small sample remote sensing image scene classification method based on iterative feature distribution learning

Publications (1)

Publication Number Publication Date
CN113837046A 2021-12-24

Family

ID=78959652

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111089674.XA Pending CN113837046A (en) 2021-09-17 2021-09-17 Small sample remote sensing image scene classification method based on iterative feature distribution learning

Country Status (1)

Country Link
CN (1) CN113837046A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114120048A (en) * 2022-01-26 2022-03-01 中兴通讯股份有限公司 Image processing method, electronic device and computer storage medium
WO2023226227A1 (en) * 2022-05-27 2023-11-30 福建龙氟新材料有限公司 Automatic batching system for preparing electronic-grade hydrofluoric acid and batching method therefor

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110097173A (en) * 2019-04-04 2019-08-06 天津大学 A kind of convolutional neural networks topological structure improving network performance and generalization ability
CN113222011A (en) * 2021-05-10 2021-08-06 西北工业大学 Small sample remote sensing image classification method based on prototype correction
US20210264150A1 (en) * 2020-02-26 2021-08-26 Central South University Urban remote sensing image scene classification method in consideration of spatial relationships

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110097173A (en) * 2019-04-04 2019-08-06 天津大学 A kind of convolutional neural networks topological structure improving network performance and generalization ability
US20210264150A1 (en) * 2020-02-26 2021-08-26 Central South University Urban remote sensing image scene classification method in consideration of spatial relationships
CN113222011A (en) * 2021-05-10 2021-08-06 西北工业大学 Small sample remote sensing image classification method based on prototype correction

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Qingjie Zeng et al., "IDLN: Iterative Distribution Learning Network for Few-Shot Remote Sensing Image Scene Classification," IEEE Geoscience and Remote Sensing Letters, pp. 1-4 *
Geng Jie, "Research on SAR Remote Sensing Image Classification Methods Based on Deep Learning," China Master's Theses Full-text Database, Engineering Science and Technology II *

Similar Documents

Publication Publication Date Title
CN111191732B (en) Target detection method based on full-automatic learning
CN109190524B (en) Human body action recognition method based on generation of confrontation network
CN114241282A (en) Knowledge distillation-based edge equipment scene identification method and device
CN110070074B (en) Method for constructing pedestrian detection model
CN109671102B (en) Comprehensive target tracking method based on depth feature fusion convolutional neural network
CN111160474A (en) Image identification method based on deep course learning
CN103425996B (en) A kind of large-scale image recognition methods of parallel distributed
CN111079847B (en) Remote sensing image automatic labeling method based on deep learning
CN112132014B (en) Target re-identification method and system based on non-supervised pyramid similarity learning
CN111639679A (en) Small sample learning method based on multi-scale metric learning
CN113222011B (en) Small sample remote sensing image classification method based on prototype correction
CN110942472B (en) Nuclear correlation filtering tracking method based on feature fusion and self-adaptive blocking
CN110516095A (en) Weakly supervised depth Hash social activity image search method and system based on semanteme migration
CN105760821A (en) Classification and aggregation sparse representation face identification method based on nuclear space
CN113128620B (en) Semi-supervised domain self-adaptive picture classification method based on hierarchical relationship
CN113837046A (en) Small sample remote sensing image scene classification method based on iterative feature distribution learning
CN106156805A (en) A kind of classifier training method of sample label missing data
CN110287985B (en) Depth neural network image identification method based on variable topology structure with variation particle swarm optimization
CN110110128B (en) Fast supervised discrete hash image retrieval system for distributed architecture
CN112200262B (en) Small sample classification training method and device supporting multitasking and cross-tasking
CN104680167B (en) Auroral oval location determining method based on deep learning
CN116597244A (en) Small sample target detection method based on meta-learning method
CN111652177A (en) Signal feature extraction method based on deep learning
CN114187506B (en) Remote sensing image scene classification method of viewpoint-aware dynamic routing capsule network
CN117671673B (en) Small sample cervical cell classification method based on self-adaptive tensor subspace

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20211224

WD01 Invention patent application deemed withdrawn after publication