CN111401426A - Small sample hyperspectral image classification method based on pseudo label learning - Google Patents

Small sample hyperspectral image classification method based on pseudo label learning

Info

Publication number
CN111401426A
CN111401426A
Authority
CN
China
Prior art keywords
sample
data set
samples
unlabeled
small sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010167527.9A
Other languages
Chinese (zh)
Other versions
CN111401426B (en)
Inventor
魏巍
李宇
张磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202010167527.9A priority Critical patent/CN111401426B/en
Publication of CN111401426A publication Critical patent/CN111401426A/en
Application granted granted Critical
Publication of CN111401426B publication Critical patent/CN111401426B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a small-sample hyperspectral image classification method based on pseudo-label learning. First, the pixels surrounding each hyperspectral pixel are sampled, with the pixel as the center, to generate hyperspectral samples; a small-sample dataset is built from the few labeled samples, and soft pseudo-labels are assigned to the unlabeled samples using the small-sample dataset to obtain an auxiliary dataset. Then, a two-branch deep neural network consisting of a shared feature extractor and two different classifiers is trained with the small-sample dataset and the auxiliary dataset. Finally, the trained network predicts labels for the test data, completing the classification. Because the method effectively exploits the latent information of the unlabeled samples, it improves the accuracy of small-sample hyperspectral image classification.

Description

Small sample hyperspectral image classification method based on pseudo label learning
Technical Field
The invention belongs to the technical field of hyperspectral image processing, and particularly relates to a small-sample hyperspectral image classification method based on pseudo-label learning.
Background
Unlike conventional machine learning methods, deep neural networks have strong feature representation capability. To alleviate the vanishing-gradient problem of deep networks, the document "Zhong Z, Li J, Luo Z, et al. Spectral-spatial residual network for hyperspectral image classification: A 3-D deep learning framework [J]. IEEE Transactions on Geoscience and Remote Sensing, 2017, 56(2): 847-858" proposes a spectral-spatial residual network that jointly exploits the spectral and spatial information of hyperspectral images. However, such deep classification models require a large number of labeled samples for training, while labeling hyperspectral data is expensive; with only a small number of labeled samples, it is difficult to train a deep neural network classification model effectively.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a small-sample hyperspectral image classification method based on pseudo-label learning. First, the pixels surrounding each hyperspectral pixel are sampled, with the pixel as the center, to generate hyperspectral samples; a small-sample dataset is built from the few labeled hyperspectral samples, and soft pseudo-labels are assigned to the unlabeled samples using the small-sample dataset to obtain an auxiliary dataset. Then, a two-branch deep neural network consisting of a shared feature extractor and two different classifiers is trained with the small-sample dataset and the auxiliary dataset. Finally, the trained network predicts labels for the test data, completing the classification. Because the method effectively exploits the latent information of the unlabeled samples, it improves the accuracy of small-sample hyperspectral image classification.
A small-sample hyperspectral image classification method based on pseudo-label learning comprises the following steps:
Step 1: sample the pixels surrounding each hyperspectral pixel, with the pixel as the center, to generate hyperspectral data samples, and form a small-sample dataset T = {(x_1, y_1), (x_2, y_2), …, (x_n, y_n)} from the labeled samples, where x_i ∈ ℝ^{w×w×d} denotes the i-th labeled sample, w is the size of the sampling window (w = 3, 5 or 7), and d is the number of bands of the hyperspectral image; y_i ∈ {0, 1}^L is the label of the i-th labeled sample in one-hot form, i.e. an L-dimensional vector, where L is the number of classes, i = 1, 2, …, n, and n is the number of labeled samples; form a test dataset U = {z_1, z_2, …, z_m} from the unlabeled samples, where z_j ∈ ℝ^{w×w×d} denotes the j-th unlabeled sample, j = 1, 2, …, m, and m is the number of unlabeled samples; the number of labeled samples n is L, 3L or 5L, and the number of unlabeled samples m is the total number of samples minus the number of labeled samples;
Step 2: take the j-th labeled sample of class i, h_{ij}, as the j-th agent of class i, agent_{ij}, i = 1, 2, …, L, j = 1, 2, …, i_n, where i_n denotes the number of labeled samples of class i; compute the distance between the p-th unlabeled sample z_p and all agents, p = 1, 2, …, m, and select the agent in each class closest to z_p as the reference agent of z_p in that class, obtaining the reference-agent sequence A_p = {a_p^1, a_p^2, …, a_p^L} of z_p, where a_p^i denotes the reference agent of z_p in class i; then compute the value of the i-th element of the soft pseudo-label vector s_p of z_p according to s_p^i = softmax(-D(a_p^i, z_p)), thereby obtaining the soft pseudo-label vector s_p = [s_p^1, s_p^2, …, s_p^L] of the unlabeled sample z_p, where D(a_p^i, z_p) denotes the distance between the reference agent a_p^i and the unlabeled sample z_p, and softmax(·) is the normalized exponential function; all unlabeled samples in the test dataset, together with their soft pseudo-labels, form the auxiliary dataset S = {(z_1, s_1), (z_2, s_2), …, (z_m, s_m)};
Step 3: construct a two-branch deep neural network comprising a feature extractor and two classifiers, namely a small-sample classifier and an auxiliary-sample classifier; the feature extractor and the small-sample classifier form the small-sample classification branch of the network, and the feature extractor and the auxiliary-sample classifier form the auxiliary-sample classification branch of the network; the feature extractor may be any convolutional neural network, and each classifier is simply one or more fully-connected layers;
input the small-sample dataset into the small-sample classification branch and the auxiliary dataset into the auxiliary-sample classification branch; the shared feature extractor extracts the discriminative features of each dataset, and the two classifiers then produce the class prediction labels of their respective datasets; with cross-entropy functions as the loss functions of the small-sample classifier and the auxiliary-sample classifier, optimize and train the network by stochastic gradient descent using the Adam optimizer to obtain the trained network;
Step 4: input the test dataset into the small-sample classification branch of the trained network obtained in step 3 to obtain the classification results.
The invention has the beneficial effects that: because soft pseudo-label learning is used to assign labels to the unlabeled samples, the intrinsic information of the unlabeled samples is effectively exploited; the auxiliary dataset adds constraints to the training of the feature extractor shared with the small-sample dataset, which effectively prevents overfitting of the classification model and significantly improves hyperspectral image classification accuracy.
Detailed Description
The present invention is further illustrated by, but not limited to, the following embodiments.
The invention provides a small-sample hyperspectral image classification method based on pseudo-label learning, which is implemented by the following process:
1. building a data set
Sample the pixels surrounding each hyperspectral pixel, with the pixel as the center, to generate a hyperspectral cube (i.e. a sample), and form the small-sample dataset T from the labeled samples, i.e. T = {(x_1, y_1), (x_2, y_2), …, (x_n, y_n)}, where x_i ∈ ℝ^{w×w×d} denotes the i-th labeled sample, w is the size of the sampling window (3, 5 or 7 may equally be taken), and d is the number of bands of the hyperspectral image; y_i ∈ {0, 1}^L is the label of the i-th sample in one-hot form, i.e. an L-dimensional vector, where L is the number of classes, i = 1, 2, …, n, and n is the number of labeled samples. Form the test dataset U from the unlabeled samples, i.e. U = {z_1, z_2, …, z_m}, where z_j ∈ ℝ^{w×w×d}. The number of labeled samples n takes the values L, 3L or 5L, i.e. only 1, 3 or 5 labeled samples per class; the number of unlabeled samples m is the total number of samples minus the number of labeled samples, with m > 10000 on several benchmark hyperspectral datasets.
2. Soft-pseudo label learning
The invention takes each labeled sample as an agent and assigns a unique soft pseudo-label to each unlabeled sample by computing the distances between the unlabeled sample and all agents.
First, each labeled sample h_{ij} is taken as an agent:

agent_{ij} = h_{ij}    (1)

where agent_{ij} denotes the j-th agent of class i, h_{ij} is the j-th labeled sample of class i, i = 1, 2, …, L, j = 1, 2, …, i_n, and i_n denotes the total number of samples of class i.
Compute the distance between the p-th unlabeled sample z_p ∈ ℝ^{w×w×d} and all agents; taking the j-th agent of class i, agent_{ij}, as an example:

D(agent_{ij}, z_p) = sim(agent_{ij}, z_p)    (2)

where D(agent_{ij}, z_p) denotes the distance between agent_{ij} and the sample z_p, and sim(·, ·) is a similarity function, for which any distance formula may be selected.
Select, in each class, the agent nearest to z_p as the reference agent of z_p in that class; taking the reference agent a_p^i of z_p in class i as an example:

a_p^i = agent_{ij*},  j* = argmin_j D(agent_{ij}, z_p)    (3)

The reference-agent sequence A_p = {a_p^1, a_p^2, …, a_p^L} of the unlabeled sample z_p is thus obtained by equation (3).
Then the value of the i-th element of the soft pseudo-label vector s_p of the unlabeled sample z_p is obtained by:

s_p^i = softmax(-D(a_p^i, z_p))    (4)

where softmax(·) is the normalized exponential function, which maps its inputs to real numbers between 0 and 1 whose normalized sum is 1. This yields the soft pseudo-label vector s_p = [s_p^1, s_p^2, …, s_p^L] of z_p.

Each unlabeled sample z_p (p = 1, 2, …, m) is processed as above to obtain its corresponding soft pseudo-label.

All unlabeled samples in the test dataset, together with their soft pseudo-labels, form the auxiliary dataset S, i.e. S = {(z_1, s_1), (z_2, s_2), …, (z_m, s_m)}.
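The agent-based soft pseudo-label assignment described above can be sketched as follows. Euclidean distance is assumed for the similarity function sim(·, ·), which the patent deliberately leaves open; samples are flattened to vectors for brevity; and all names are illustrative:

```python
import numpy as np

def soft_pseudo_labels(agents_per_class, unlabeled):
    """agents_per_class: list of L arrays, each (n_i, d) holding the
    labeled samples (agents) of one class; unlabeled: (m, d).
    Returns the (m, L) matrix of soft pseudo-label vectors s_p."""
    ref_dists = []
    for agents in agents_per_class:
        # distance from every unlabeled sample to every agent of the class
        d = np.linalg.norm(unlabeled[:, None, :] - agents[None, :, :], axis=-1)
        ref_dists.append(d.min(axis=1))  # nearest agent = reference agent
    neg = -np.stack(ref_dists, axis=1)   # softmax over negated distances
    e = np.exp(neg - neg.max(axis=1, keepdims=True))  # numerically stable
    return e / e.sum(axis=1, keepdims=True)

# toy example: two classes in 2-D, one unlabeled point close to class 0
agents = [np.array([[0.0, 0.0], [0.1, 0.0]]), np.array([[5.0, 5.0]])]
labels = soft_pseudo_labels(agents, np.array([[0.2, 0.1]]))
print(labels.shape)  # (1, 2); the row sums to 1 and class 0 dominates
```

Each row of the result is one soft pseudo-label vector s_p: a probability distribution over the L classes rather than a hard class assignment.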
3. Building a network and training
The invention uses a shared feature extractor to extract discriminative features on both the auxiliary dataset and the small-sample dataset, and constructs a different classifier on each of the two datasets to map the discriminative features to the label space. That is, the feature extractor F_θ (θ being its parameters) is trained jointly with the small-sample dataset T and the auxiliary dataset S, while different classifiers C(·) and C'(·) are trained on T and S respectively, forming a two-branch deep neural network that performs hyperspectral image classification. The feature extractor may be any three-dimensional convolutional neural network structure, and each classifier may be one or more fully-connected layers.
For the classification branch of the small-sample dataset T, the predicted label ŷ_k of sample x_k is obtained by:

ŷ_k = C(F_θ(x_k))    (5)
Likewise, for the classification branch of the auxiliary dataset S, the predicted pseudo-label ŝ_k of sample z_k is obtained by:

ŝ_k = C'(F_θ(z_k))    (6)
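A minimal numeric sketch of the two-branch forward pass may help here: a toy linear-plus-ReLU layer stands in for the 3-D convolutional extractor F_θ, single linear layers stand in for the classifiers C(·) and C'(·), and all shapes and names are our own assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_feat, L = 8, 4, 3                   # toy input dim, feature dim, classes

W_f = rng.standard_normal((d_in, d_feat))   # shared extractor F_theta (toy)
W_c = rng.standard_normal((d_feat, L))      # small-sample classifier C
W_c2 = rng.standard_normal((d_feat, L))     # auxiliary classifier C'

def softmax(a):
    e = np.exp(a - a.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def features(x):
    # F_theta is shared: the same weights serve both branches
    return np.maximum(x @ W_f, 0.0)

x_batch = rng.standard_normal((5, d_in))    # labeled (small-sample) batch
z_batch = rng.standard_normal((7, d_in))    # unlabeled (auxiliary) batch
y_hat = softmax(features(x_batch) @ W_c)    # small-sample branch prediction
s_hat = softmax(features(z_batch) @ W_c2)   # auxiliary branch prediction
print(y_hat.shape, s_hat.shape)  # (5, 3) (7, 3)
```

The key design point is that gradients from both branches flow into the same W_f, which is how the auxiliary dataset constrains the training of the shared feature extractor.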
On the classification branch of the small-sample dataset T, the parameters of the neural network are updated using the following cross-entropy function as the loss function:

L_few = -(1/n) Σ_{k=1}^{n} Σ_{j=1}^{L} 1[j = y_k] log ŷ_k^j    (7)

where L_few denotes the loss function on the classification branch of the small-sample dataset, and 1[j = y_k] takes the value 1 when j = y_k and 0 otherwise.
Likewise, the parameters of the classification branch of the auxiliary dataset S are updated with:

L_aux = -(1/m) Σ_{k=1}^{m} Σ_{j=1}^{L} s_k^j log ŝ_k^j    (8)

where s_k^j denotes the j-th element of the label s_k, ŝ_k^j denotes the j-th element of the predicted label ŝ_k, and L_aux denotes the loss function on the classification branch of the auxiliary dataset.
The two cross-entropy loss functions are optimized by stochastic gradient descent using the Adam optimizer to obtain the trained network.
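The two loss functions L_few and L_aux are plain cross-entropies, differing only in whether the target is a one-hot label or a soft pseudo-label; a sketch with illustrative names and toy numbers:

```python
import numpy as np

def loss_few(y_onehot, y_hat):
    """Hard-label cross-entropy on the small-sample branch (L_few)."""
    return -np.mean(np.sum(y_onehot * np.log(y_hat), axis=1))

def loss_aux(s, s_hat):
    """Soft-label cross-entropy on the auxiliary branch (L_aux);
    identical form, but the target s is a probability vector."""
    return -np.mean(np.sum(s * np.log(s_hat), axis=1))

y = np.array([[1.0, 0.0], [0.0, 1.0]])      # two one-hot labels
y_hat = np.array([[0.9, 0.1], [0.2, 0.8]])  # branch predictions
s = np.array([[0.7, 0.3]])                  # one soft pseudo-label
s_hat = np.array([[0.6, 0.4]])
print(round(loss_few(y, y_hat), 4))  # -> 0.1643
print(round(loss_aux(s, s_hat), 4))  # -> 0.6325
```

In training, the two losses are minimized jointly, so the shared feature extractor receives gradients from both branches.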
4. Classification process
The test dataset U = {z_1, z_2, …, z_m} is input, sample by sample, into the small-sample classification branch of the trained network obtained in step 3 to obtain the classification results.
To verify the effectiveness of the method of the invention, experiments were carried out on three benchmark datasets, Indian Pines, Pavia University and Salinas, comparing against the method proposed in the document "Zhong Z, Li J, Luo Z, et al. Spectral-spatial residual network for hyperspectral image classification: A 3-D deep learning framework [J]. IEEE Transactions on Geoscience and Remote Sensing, 2017, 56(2): 847-858".

Claims (1)

1. A small-sample hyperspectral image classification method based on pseudo-label learning, comprising the following steps:
Step 1: sample the pixels surrounding each hyperspectral pixel, with the pixel as the center, to generate hyperspectral data samples, and form a small-sample dataset T = {(x_1, y_1), (x_2, y_2), …, (x_n, y_n)} from the labeled samples, where x_i ∈ ℝ^{w×w×d} denotes the i-th labeled sample, w is the size of the sampling window (w = 3, 5 or 7), and d is the number of bands of the hyperspectral image; y_i ∈ {0, 1}^L is the label of the i-th labeled sample in one-hot form, i.e. an L-dimensional vector, where L is the number of classes, i = 1, 2, …, n, and n is the number of labeled samples; form a test dataset U = {z_1, z_2, …, z_m} from the unlabeled samples, where z_j ∈ ℝ^{w×w×d} denotes the j-th unlabeled sample, j = 1, 2, …, m, and m is the number of unlabeled samples; the number of labeled samples n is L, 3L or 5L, and the number of unlabeled samples m is the total number of samples minus the number of labeled samples;
Step 2: take the j-th labeled sample of class i, h_{ij}, as the j-th agent of class i, agent_{ij}, i = 1, 2, …, L, j = 1, 2, …, i_n, where i_n denotes the number of labeled samples of class i; compute the distance between the p-th unlabeled sample z_p and all agents, p = 1, 2, …, m, and select the agent in each class closest to z_p as the reference agent of z_p in that class, obtaining the reference-agent sequence A_p = {a_p^1, a_p^2, …, a_p^L} of z_p, where a_p^i denotes the reference agent of z_p in class i; then compute the value of the i-th element of the soft pseudo-label vector s_p of z_p according to s_p^i = softmax(-D(a_p^i, z_p)), thereby obtaining the soft pseudo-label vector s_p = [s_p^1, s_p^2, …, s_p^L] of the unlabeled sample z_p, where D(a_p^i, z_p) denotes the distance between the reference agent a_p^i and the unlabeled sample z_p, and softmax(·) is the normalized exponential function; all unlabeled samples in the test dataset, together with their soft pseudo-labels, form the auxiliary dataset S = {(z_1, s_1), (z_2, s_2), …, (z_m, s_m)};
Step 3: construct a two-branch deep neural network comprising a feature extractor and two classifiers, namely a small-sample classifier and an auxiliary-sample classifier; the feature extractor and the small-sample classifier form the small-sample classification branch of the network, and the feature extractor and the auxiliary-sample classifier form the auxiliary-sample classification branch of the network; the feature extractor may be any convolutional neural network, and each classifier is simply one or more fully-connected layers;
input the small-sample dataset into the small-sample classification branch and the auxiliary dataset into the auxiliary-sample classification branch; the shared feature extractor extracts the discriminative features of each dataset, and the two classifiers then produce the class prediction labels of their respective datasets; with cross-entropy functions as the loss functions of the small-sample classifier and the auxiliary-sample classifier, optimize and train the network by stochastic gradient descent using the Adam optimizer to obtain the trained network;
Step 4: input the test dataset into the small-sample classification branch of the trained network obtained in step 3 to obtain the classification results.
CN202010167527.9A 2020-03-11 2020-03-11 Small sample hyperspectral image classification method based on pseudo label learning Active CN111401426B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010167527.9A CN111401426B (en) 2020-03-11 2020-03-11 Small sample hyperspectral image classification method based on pseudo label learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010167527.9A CN111401426B (en) 2020-03-11 2020-03-11 Small sample hyperspectral image classification method based on pseudo label learning

Publications (2)

Publication Number Publication Date
CN111401426A true CN111401426A (en) 2020-07-10
CN111401426B CN111401426B (en) 2022-04-08

Family

ID=71432388

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010167527.9A Active CN111401426B (en) 2020-03-11 2020-03-11 Small sample hyperspectral image classification method based on pseudo label learning

Country Status (1)

Country Link
CN (1) CN111401426B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651317A (en) * 2020-12-18 2021-04-13 中国电子科技集团公司信息科学研究院 Hyperspectral image classification method and system for sample relation learning
CN112816499A (en) * 2021-04-19 2021-05-18 征图新视(江苏)科技股份有限公司 Hyperspectral and deep learning combined industrial detection system
CN113408605A (en) * 2021-06-16 2021-09-17 西安电子科技大学 Hyperspectral image semi-supervised classification method based on small sample learning
CN114461802A (en) * 2022-02-09 2022-05-10 湘潭大学 Self-training method of machine reading understanding model for question refusing to answer
CN114821198A (en) * 2022-06-24 2022-07-29 齐鲁工业大学 Cross-domain hyperspectral image classification method based on self-supervision and small sample learning

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060235812A1 (en) * 2005-04-14 2006-10-19 Honda Motor Co., Ltd. Partially supervised machine learning of data classification based on local-neighborhood Laplacian Eigenmaps
CN104182767A (en) * 2014-09-05 2014-12-03 西安电子科技大学 Active learning and neighborhood information combined hyperspectral image classification method
CN104732246A (en) * 2015-03-05 2015-06-24 重庆邮电大学 Semi-supervised cooperative training hyperspectral image classification method
CN105046630A (en) * 2014-04-04 2015-11-11 影像搜索者公司 image tag add system
CN108596256A (en) * 2018-04-26 2018-09-28 北京航空航天大学青岛研究院 One kind being based on RGB-D object identification grader building methods
CN109858557A (en) * 2019-02-13 2019-06-07 安徽大学 A kind of new hyperspectral image data semisupervised classification algorithm
CN109948708A (en) * 2019-03-21 2019-06-28 西安电子科技大学 Multispectral image feature level information fusion method when more based on the implicit canonical of iteration
CN109978071A (en) * 2019-04-03 2019-07-05 西北工业大学 Hyperspectral image classification method based on data augmentation and Multiple Classifier Fusion
CN110009015A (en) * 2019-03-25 2019-07-12 西北工业大学 EO-1 hyperion small sample classification method based on lightweight network and semi-supervised clustering
CN110309868A (en) * 2019-06-24 2019-10-08 西北工业大学 In conjunction with the hyperspectral image classification method of unsupervised learning
CN110363178A (en) * 2019-07-23 2019-10-22 上海黑塞智能科技有限公司 The airborne laser point cloud classification method being embedded in based on part and global depth feature

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060235812A1 (en) * 2005-04-14 2006-10-19 Honda Motor Co., Ltd. Partially supervised machine learning of data classification based on local-neighborhood Laplacian Eigenmaps
CN105046630A (en) * 2014-04-04 2015-11-11 影像搜索者公司 image tag add system
CN104182767A (en) * 2014-09-05 2014-12-03 西安电子科技大学 Active learning and neighborhood information combined hyperspectral image classification method
CN104732246A (en) * 2015-03-05 2015-06-24 重庆邮电大学 Semi-supervised cooperative training hyperspectral image classification method
CN108596256A (en) * 2018-04-26 2018-09-28 北京航空航天大学青岛研究院 One kind being based on RGB-D object identification grader building methods
CN109858557A (en) * 2019-02-13 2019-06-07 安徽大学 A kind of new hyperspectral image data semisupervised classification algorithm
CN109948708A (en) * 2019-03-21 2019-06-28 西安电子科技大学 Multispectral image feature level information fusion method when more based on the implicit canonical of iteration
CN110009015A (en) * 2019-03-25 2019-07-12 西北工业大学 EO-1 hyperion small sample classification method based on lightweight network and semi-supervised clustering
CN109978071A (en) * 2019-04-03 2019-07-05 西北工业大学 Hyperspectral image classification method based on data augmentation and Multiple Classifier Fusion
CN110309868A (en) * 2019-06-24 2019-10-08 西北工业大学 In conjunction with the hyperspectral image classification method of unsupervised learning
CN110363178A (en) * 2019-07-23 2019-10-22 上海黑塞智能科技有限公司 The airborne laser point cloud classification method being embedded in based on part and global depth feature

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
JINYANG ZHANG 等: "IMPROVING HYPERSPECTRAL IMAGE CLASSIFICATION WITH UNSUPERVISED KNOWLEDGE LEARNING", 《IGARSS 2019》 *
JYOTI MAGGU 等: "Label Consistent Transform Learning for Hyperspectral Image Classification", 《ARXIV:1912.11405V1》 *
LEI ZHANG 等: "Towards Effective Deep Embedding for Zero-Shot Learning", 《ARXIV:1808.10075V2》 *
LIU Lili et al.: "Semi-supervised classification of hyperspectral imagery based on pseudo-label deep learning", Computer Engineering and Applications *
LI Xiuxin et al.: "Semi-supervised hyperspectral image classification based on convolutional neural networks", Journal of Electronic Measurement and Instrumentation *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651317A (en) * 2020-12-18 2021-04-13 中国电子科技集团公司信息科学研究院 Hyperspectral image classification method and system for sample relation learning
CN112651317B (en) * 2020-12-18 2022-11-25 中国电子科技集团公司信息科学研究院 Hyperspectral image classification method and system for sample relation learning
CN112816499A (en) * 2021-04-19 2021-05-18 征图新视(江苏)科技股份有限公司 Hyperspectral and deep learning combined industrial detection system
CN112816499B (en) * 2021-04-19 2021-06-29 征图新视(江苏)科技股份有限公司 Hyperspectral and deep learning combined industrial detection system
CN113408605A (en) * 2021-06-16 2021-09-17 西安电子科技大学 Hyperspectral image semi-supervised classification method based on small sample learning
CN114461802A (en) * 2022-02-09 2022-05-10 湘潭大学 Self-training method of machine reading understanding model for question refusing to answer
CN114821198A (en) * 2022-06-24 2022-07-29 齐鲁工业大学 Cross-domain hyperspectral image classification method based on self-supervision and small sample learning

Also Published As

Publication number Publication date
CN111401426B (en) 2022-04-08

Similar Documents

Publication Publication Date Title
CN111368896B (en) Hyperspectral remote sensing image classification method based on dense residual three-dimensional convolutional neural network
CN111401426B (en) Small sample hyperspectral image classification method based on pseudo label learning
de Lima et al. Petrographic microfacies classification with deep convolutional neural networks
Su et al. Rock classification in petrographic thin section images based on concatenated convolutional neural networks
CN103886342B (en) Hyperspectral image classification method based on spectrums and neighbourhood information dictionary learning
CN107145830A (en) Hyperspectral image classification method with depth belief network is strengthened based on spatial information
CN107392237B (en) Cross-domain foundation cloud picture classification method based on migration visual information
CN108830312B (en) Integrated learning method based on sample adaptive expansion
CN109886161A (en) A kind of road traffic index identification method based on possibility cluster and convolutional neural networks
CN113378792B (en) Weak supervision cervical cell image analysis method fusing global and local information
CN105069478A (en) Hyperspectral remote sensing surface feature classification method based on superpixel-tensor sparse coding
CN106203522A (en) Hyperspectral image classification method based on three-dimensional non-local mean filtering
CN110414616B (en) Remote sensing image dictionary learning and classifying method utilizing spatial relationship
CN111639697B (en) Hyperspectral image classification method based on non-repeated sampling and prototype network
CN109472733A (en) Image latent writing analysis method based on convolutional neural networks
Rachmad et al. Mycobacterium tuberculosis images classification based on combining of convolutional neural network and support vector machine
CN113673556A (en) Hyperspectral image classification method based on multi-scale dense convolution network
CN114119585A (en) Method for identifying key feature enhanced gastric cancer image based on Transformer
Seo et al. Classification of igneous rocks from petrographic thin section images using convolutional neural network
CN105894035B (en) SAR image classification method based on SAR-SIFT and DBN
CN107273919A (en) A kind of EO-1 hyperion unsupervised segmentation method that generic dictionary is constructed based on confidence level
CN114373079A (en) Rapid and accurate ground penetrating radar target detection method
CN110378307B (en) Texture image direction field estimation method based on deep learning
CN113989528B (en) Hyperspectral image characteristic representation method based on depth joint sparse-collaborative representation
CN113887652B (en) Remote sensing image weak and small target detection method based on morphology and multi-example learning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant