CN113947725A - Hyperspectral image classification method based on convolution width migration network - Google Patents
- Publication number
- CN113947725A (application CN202111244668.7A)
- Authority
- CN
- China
- Prior art keywords
- domain
- layer
- network
- adaptation
- field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The invention discloses a hyperspectral image classification method based on a convolution width migration network. First, the original hyperspectral image is preprocessed by band selection to remove band redundancy. A domain adaptation layer is then added to a convolutional neural network to simultaneously align the marginal probability distributions and second-order statistics of the source and target domains, yielding a convolutional domain adaptation network that extracts deep domain-invariant features from the original hyperspectral data. Next, a weighted conditional maximum mean discrepancy is proposed, and a regularization term based on it is added to a width network to obtain a weighted conditional width learning network, which reduces the conditional-distribution discrepancy and class-weight bias between the two domains while performing width expansion on the features. Finally, the output weights are rapidly computed through ridge regression theory. The method can complete unsupervised classification of the target-domain hyperspectral image using only source-domain labeled samples.
Description
Technical Field
The invention relates to the technical field of pattern recognition, and in particular to a hyperspectral image classification method based on a convolution width migration network.
Background
Hyperspectral images contain rich spectral and spatial information and integrate image and spectrum in a single data cube. This property has led to their wide use in agriculture, mineral exploration, national defense and security, and other fields. Hyperspectral image classification, which aims to infer the category of each pixel from the spectral and spatial information of the image, has become a research hotspot. Many algorithms have been proposed to improve classification accuracy, including support vector machines, sparse representation, and convolutional neural networks. The excellent performance of these supervised learning algorithms usually requires a large number of labeled samples. However, labeling hyperspectral data consumes substantial manpower and material resources, and obtaining large-scale labeled data is very difficult. Therefore, how to learn a classification model with strong generalization ability at low labeling cost is a research hotspot in hyperspectral image analysis. To address this problem, machine learning approaches such as active learning, semi-supervised learning, and data augmentation have been proposed.
Unlike active learning, which actively reduces labeling cost, and semi-supervised learning, which mines information from unlabeled samples, transfer learning can transfer knowledge from a related domain to other domains, i.e., from a source domain to a target domain. When the target domain has no labeled samples, or an insufficient number of them, transfer learning can solve the target-domain classification problem using samples with the same or similar labels in the source domain.
Deep learning has strong nonlinear representation ability and can extract high-level, compact features from its input. The convolutional neural network, the best-known deep learning model, has strong feature extraction capability. Given these advantages, transfer learning models based on convolutional neural networks are well worth studying.
The recently proposed width learning system is a feedforward neural network consisting of only three layers: an input layer, a middle layer, and an output layer. Compared with deep learning, width learning can expand features in the width direction, and it has a simple structure, fast computation, and is easy to combine with other models.
Disclosure of Invention
The purpose of the invention is as follows: to overcome the deficiencies of the prior art, the invention provides a hyperspectral image classification method based on a convolution width migration learning network, which can complete unsupervised classification of the target-domain hyperspectral image using only source-domain labeled samples.
The technical scheme is as follows: the invention discloses a hyperspectral image classification method based on a convolution width migration network, which comprises the following steps of:
step 1, perform dimension reduction on the original hyperspectral image using band selection to remove band redundancy, obtaining the dimension-reduced hyperspectral data X_0; X_0 includes the reduced-dimension source-domain data X_s and the target-domain data X_t.
Step 2, construct a domain adaptation layer and add it into the convolutional neural network, simultaneously aligning the marginal probability distributions and second-order statistics of the source and target domains to obtain the convolutional domain adaptation network CDAN; classify the target-domain data with the CDAN to obtain the target-domain pseudo labels.
Further, the hyperspectral image classification method based on the convolution width migration network further comprises the following steps,
step 3, combine the source-domain labels, the target-domain pseudo labels and the maximum mean discrepancy to construct the weighted conditional maximum mean discrepancy WCMMD, and add a domain adaptation regularization term based on the WCMMD into the MF mapping process to obtain the weighted conditional width learning network WCBN;
step 4, input the features extracted by the CDAN into the WCBN, use the WCBN to perform weighted conditional-probability-distribution alignment on the deep domain-invariant features extracted by the CDAN, and perform feature-width expansion on the domain-invariant features;
step 5, calculate the output weights through ridge regression theory.
Further, step 2 specifically includes the following steps:
step 2.1, construct a domain adaptation layer and add it into the convolutional neural network, wherein the domain adaptation layer comprises a marginal (edge) domain adaptation term D_M(θ) and a covariance domain adaptation term D_C(θ), where θ denotes the network parameters; the convolutional domain adaptation network CDAN is thereby obtained;
the convolutional domain adaptation network CDAN comprises a convolutional layer, a nonlinear layer, a pooling layer, a fully connected layer, a domain adaptation layer and a classification layer connected in sequence;
step 2.2, connect the input X_0 to the convolutional layer through convolution kernels, and perform feature extraction sequentially through the convolutional layer, the nonlinear layer, the pooling layer and the fully connected layer;
step 2.3, feed the features extracted by the fully connected layer into the domain adaptation layer;
step 2.4, connect the output of the domain adaptation layer to the classification layer for classification,
the loss function of the convolutional domain adaptation network CDAN is defined as:
L(θ) = L_c(θ) + α_1·D_M(θ) + α_2·D_C(θ)
where L_c(θ) is the source-domain classification loss, D_M(θ) and D_C(θ) are the marginal and covariance terms of the adaptation layer, and α_1 and α_2 are the marginal and covariance domain adaptation parameters, respectively;
the pretrained convolutional domain adaptation network CDAN is used as an auxiliary classifier, and the target-domain pseudo labels are obtained with this auxiliary classifier.
Further, step 3 specifically includes the following steps:
the deep domain-invariant features X ∈ R^{n×d_2} are extracted via the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X;
the extracted feature X is taken as the input and mapped to the feature nodes through the weights A_i; the ith group of mapped features MF is:
Z_i = X A_i + β_i,  i = 1, ..., d_M    (1)
in formula (1), A_i ∈ R^{d_2×G_M} and β_i are the connection weights and biases from the input X to MF, d_M is the number of feature-node groups, G_M is the dimension of each group, and Z = [Z_1, ..., Z_{d_M}] denotes the mapped features;
formula (1) is optimized as:
A_i = argmin_{A_i} ‖X A_i − Z_i‖_2^2 + λ‖A_i‖_1    (2)
where λ is the regularization coefficient;
the domain adaptation regularization term D_cf(P_s, P_t) based on the weighted conditional maximum mean discrepancy WCMMD is added into the MF mapping process to perform weighted adaptation of the conditional probability distributions of the source and target domains:
A_i = argmin_{A_i} ‖X A_i − Z_i‖_2^2 + λ‖A_i‖_1 + γ·D_cf(P_s, P_t)    (3)
where γ is the coefficient of the domain adaptation regularization term, c ∈ {1, 2, ..., C} is the class index, and the domain adaptation regularization term D_cf(P_s, P_t) is:
D_cf(P_s, P_t) = Σ_{c=1}^{C} ω_c ‖ (1/n_s^c) Σ_{x_j∈X_s^c} x_j A_i − (1/n_t^c) Σ_{x_k∈X_t^c} x_k A_i ‖_2^2
where n_s^c and n_t^c are the numbers of class-c samples in the source and target domains;
where the class-importance weight ω_c is constructed from the source-domain labels and the target-domain pseudo labels;
equation (3) can be reformulated and solved by the alternating direction method of multipliers (ADMM) to obtain the weight A_i;
further, after the required weight A_i is obtained, Z_i is computed as:
Z_i = X A_i
the weighted conditional width learning network is obtained through the above steps.
Further, step 4 specifically includes the following steps:
the MF features are mapped to the enhancement nodes EN through random weights W_E to realize feature-width expansion of the domain-invariant features:
H = σ(Z W_E)
where σ(·) is the tansig function, H ∈ R^{n×d_E} is the EN feature, and d_E is the number of EN nodes;
the source- and target-domain features in EN can be expressed as H_s = σ(Z_s W_E) and H_t = σ(Z_t W_E), where Z_s and H_s are the source-domain features of MF and EN, and Z_t and H_t are the target-domain features of MF and EN.
Further, in step 5, the output weight is calculated through a ridge regression theory, which specifically includes the following contents:
MF and EN are mapped to the output layer simultaneously; the weighted conditional width learning network WCBN objective function is:
min_W ‖U_s W − Y_s‖_2^2 + δ‖W‖_2^2
where U_s = [Z_s | H_s] and δ is the regularization coefficient; solving the above formula by ridge regression theory yields:
W = (U_s^T U_s + δI)^{-1} U_s^T Y_s
the prediction result is as follows:
Yt=UtW
has the advantages that: the method can complete the unsupervised classification of the hyperspectral images of the target domain only by using the source domain labeled samples.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings.
As shown in fig. 1, the hyperspectral image classification method based on the convolution width migration learning network of the invention includes the following steps:
step 1, perform dimension reduction on the original hyperspectral image using band selection to remove band redundancy, obtaining the dimension-reduced hyperspectral data X_0; X_0 includes the reduced-dimension source-domain data X_s and the target-domain data X_t.
Define the number of bands of the original hyperspectral image (HSI) as N_b and the number of bands after dimension reduction as d; a and b HSI bands are then selected according to the corresponding sampling intervals, where ⌊·⌋ denotes the floor (round-down) operation, the intervals being obtained from N_b and d.
defining reduced X0∈Rn×dAs input to the model, wherein X0Representing hyperspectral image data and n representing the number of samples.
X_s ∈ R^{n_s×d} and X_t ∈ R^{n_t×d} are respectively the source-domain and target-domain data after band selection, where n_s is the number of source-domain samples, n_t is the number of target-domain samples, and Y_s is the source-domain label.
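As an illustrative sketch of step 1 (the exact interval formula appears only in the original drawings, so simple uniform interval sampling is assumed here), band selection can be expressed as:

```python
import numpy as np

def band_select(cube, d):
    """Interval-based band selection: keep d roughly evenly spaced bands
    out of the N_b original bands (hypothetical uniform-interval variant
    of the patent's step 1)."""
    n_b = cube.shape[-1]
    idx = np.linspace(0, n_b - 1, d).round().astype(int)  # d band indices
    return cube[..., idx]

# A 4x4 image with 200 bands reduced to 20 bands
cube = np.random.default_rng(0).random((4, 4, 200))
reduced = band_select(cube, 20)
print(reduced.shape)  # (4, 4, 20)
```

The same selection is applied to both domains so that X_s and X_t share the d retained bands.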
Step 2, construct a domain adaptation layer and add it into the convolutional neural network, aligning the marginal probability distributions and second-order statistics of the source and target domains to obtain the convolutional domain adaptation network CDAN. The target-domain data are then classified with the CDAN to obtain the target-domain pseudo labels.
Step 2.1, construct a domain adaptation layer and add it into the convolutional neural network, wherein the domain adaptation layer comprises a marginal (edge) domain adaptation term D_M(θ) and a covariance domain adaptation term D_C(θ), where θ denotes the network parameters; the convolutional domain adaptation network CDAN is thereby obtained.
The convolutional domain adaptation network CDAN comprises a convolutional layer, a nonlinear layer, a pooling layer, a fully connected layer, a domain adaptation layer and a classification layer connected in sequence;
step 2.2, the input is connected to the convolutional layer through convolution kernels to obtain feature maps, calculated as:
F_C = I_C * K_C + b_C
where I_C is the convolutional-layer input, * denotes the convolution operation, K_C is the convolution kernel, and b_C is the bias. The convolutional-layer output is fed to the nonlinear layer, whose output is:
F_N = f(I_N)
where I_N is the nonlinear-layer input and f(·) the activation function. The nonlinear-layer output is fed to the pooling layer, whose output is:
F_P = down(I_P)
where I_P is the pooling-layer input and down(·) the down-sampling operation. The pooling-layer output is fed to the fully connected layer, whose output is:
F_F = I_F W_F + b_F
where I_F is the fully-connected-layer input, and W_F and b_F are its weights and bias.
Step 2.3, the features extracted by the fully connected layer are fed into the domain adaptation layer to reduce the distribution discrepancy between the source and target domains.
Step 2.4, the output of the domain adaptation layer is connected to the classification layer for classification,
the loss function of the convolutional domain adaptation network CDAN being defined as:
L(θ) = L_c(θ) + α_1·D_M(θ) + α_2·D_C(θ)
where L_c(θ) is the source-domain classification loss, D_M(θ) and D_C(θ) are the marginal and covariance terms of the adaptation layer, and α_1 and α_2 are the marginal and covariance domain adaptation parameters, respectively.
The pretrained convolutional domain adaptation network CDAN is used as an auxiliary classifier, and the target-domain pseudo labels are obtained with this auxiliary classifier.
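The two adaptation terms of the CDAN loss can be sketched numerically as follows (a linear-kernel MMD stands in for the marginal term and a CORAL-style covariance distance for the second-order term; the patent's exact kernels and scaling constants are not reproduced, so this is only an illustrative assumption):

```python
import numpy as np

def marginal_mmd(fs, ft):
    # Linear-kernel MMD: squared distance between domain feature means
    return float(np.sum((fs.mean(axis=0) - ft.mean(axis=0)) ** 2))

def coral_dist(fs, ft):
    # CORAL-style distance between domain feature covariances
    cs = np.cov(fs, rowvar=False)
    ct = np.cov(ft, rowvar=False)
    d = fs.shape[1]
    return float(np.sum((cs - ct) ** 2) / (4 * d * d))

def cdan_loss(cls_loss, fs, ft, alpha1=1.0, alpha2=1.0):
    # L = L_c + alpha_1 * D_M + alpha_2 * D_C
    return cls_loss + alpha1 * marginal_mmd(fs, ft) + alpha2 * coral_dist(fs, ft)

rng = np.random.default_rng(0)
fs = rng.standard_normal((100, 8))
print(cdan_loss(0.5, fs, fs))  # identical domains: both terms vanish -> 0.5
```

In training, the two terms would be evaluated on the fully connected features of mini-batches from both domains and minimized jointly with the classification loss.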
Step 3, combine the source-domain labels, the target-domain pseudo labels and the maximum mean discrepancy to construct the weighted conditional maximum mean discrepancy WCMMD, and add a domain adaptation regularization term based on the WCMMD into the MF mapping process to obtain the weighted conditional width learning network WCBN. This specifically includes the following steps:
The deep domain-invariant features X ∈ R^{n×d_2} are extracted via the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X.
Taking the extracted feature X as an input, and weighting the feature X by weight AiMapping to the feature node, the ith group of mapping features MF is:
Zi=XAi+βei,i=1,...,dM (1)
in the formula (1), the reaction mixture is,Aiand betaejRespectively representing the connection weights and deviations of the inputs X to MF, dMNumber of groups of characteristic nodes, GMFor each set of feature node dimensions, Z represents a mapping feature.
Formula (1) is optimized as:
in the formula, λ represents a regular term coefficient.
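The MF mapping of formula (1) can be sketched as follows (random weights as in width/broad learning; the sparse ℓ1/ADMM refinement of A_i described in the text is omitted from this sketch, and the group sizes are illustrative):

```python
import numpy as np

def mapped_features(X, d_m=3, g_m=8, seed=0):
    """Mapped-feature groups Z_i = X A_i + beta_i, i = 1..d_M, stacked
    into Z = [Z_1 | ... | Z_dM]; weights and biases are randomly drawn."""
    rng = np.random.default_rng(seed)
    groups = []
    for _ in range(d_m):
        A = rng.standard_normal((X.shape[1], g_m))   # connection weights A_i
        beta = rng.standard_normal(g_m)              # bias beta_i
        groups.append(X @ A + beta)
    return np.hstack(groups)

X = np.random.default_rng(1).standard_normal((10, 5))
Z = mapped_features(X, d_m=3, g_m=8)
print(Z.shape)  # (10, 24)
```

Here d_M = 3 groups of G_M = 8 nodes yield a mapped-feature matrix with d_M × G_M = 24 columns.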
Field adaptation regularization term D based on weighted condition maximum mean difference WCMMDcf(Ps,Pt) Adding the source domain and the target domain into an MF mapping process, and performing weighted adaptation on conditional probability distribution of the source domain and the target domain:
in the formula, gamma represents a coefficient of a field adaptation regular term, C is belonged to {1, 2., C } and is a category index, and the field adaptation regular term D iscf(Ps,Pt) Comprises the following steps:
Here the class-importance weight ω_c is constructed from the source-domain labels and the target-domain pseudo labels.
Equation (3) can be written as:
the above formula can be solved by an alternative direction multiplier method ADMM to obtain the weight A.
Further, the required weight A can be obtainediThen Z isiCan be obtained by the following formula:
Zi=XAi
and obtaining the weighted condition width learning network through the steps.
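The WCMMD regularizer D_cf can be sketched as the class-wise weighted discrepancy between source features (true labels) and target features (pseudo labels). Uniform class weights ω_c = 1 are assumed here, since the patent's exact weight construction is not reproduced in this text:

```python
import numpy as np

def wcmmd(Zs, ys, Zt, yt_pseudo, omega=None):
    """Weighted conditional MMD D_cf: per-class squared distance between
    source and target class means in feature space, weighted by omega_c."""
    classes = np.unique(ys)
    if omega is None:
        omega = {c: 1.0 for c in classes}  # assumed uniform class weights
    d = 0.0
    for c in classes:
        s = Zs[ys == c]
        t = Zt[yt_pseudo == c]
        if len(s) and len(t):  # skip classes absent from either domain
            d += omega[c] * np.sum((s.mean(axis=0) - t.mean(axis=0)) ** 2)
    return float(d)

Zs = np.array([[0.0, 0.0], [2.0, 2.0]])
ys = np.array([0, 1])
# Target features matching the source class means exactly -> D_cf = 0
print(wcmmd(Zs, ys, Zs.copy(), ys.copy()))  # 0.0
```

Adding γ·D_cf to the mapping objective pulls the per-class feature means of the two domains together during the ADMM solve for A_i.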
Step 4, the features extracted by the CDAN are input into the WCBN; the WCBN performs weighted conditional-probability-distribution alignment on the deep domain-invariant features extracted by the CDAN, reducing the conditional-distribution discrepancy and class-weight bias between the two domains, and performs feature-width expansion on the domain-invariant features to further enhance their representation ability.
Passing the characteristics of MF through random weight WEMapping to an incremental node EN to realize feature width expansion of the domain invariance feature:
H=σ(ZWE)
in the formula (I), the compound is shown in the specification,σ (.) is the tansig function,is characterized by EN, dEIndicating the number of EN nodes.
The source and target domain characteristics in an EN can be expressed as: hs=σ(ZsWE) And Ht=σ(ZtWE)。ZsAnd HsIs a source domain feature of MF and EN, ZtAnd HtAre MF and EN target domain features.
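The enhancement-node expansion H = σ(Z W_E) is a one-line operation; note that tansig(x) = 2/(1 + exp(−2x)) − 1 is mathematically identical to tanh(x), so tanh is used here (node counts are illustrative):

```python
import numpy as np

def enhancement_nodes(Z, d_e=16, seed=0):
    """Width expansion H = tansig(Z W_E) with random weights W_E;
    tansig coincides with the hyperbolic tangent."""
    rng = np.random.default_rng(seed)
    W_E = rng.standard_normal((Z.shape[1], d_e))
    return np.tanh(Z @ W_E)

Z = np.random.default_rng(2).standard_normal((10, 24))
H = enhancement_nodes(Z, d_e=16)
print(H.shape)  # (10, 16)
```

The same W_E is applied to Z_s and Z_t, giving H_s and H_t that are concatenated with the mapped features in step 5.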
Step 5: rapidly calculate the output weights through ridge regression theory.
And mapping MF and EN to an input layer simultaneously, wherein a weighted condition width learning network WCBN objective function is as follows:
in the formula of Us=[Zs|Hs]And δ is a regular term coefficient. Solving the above formula by using ridge regression theory to obtain:
the prediction result is as follows:
Yt=UtW
where U_t = [Z_t | H_t].
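The closed-form ridge solution and the prediction Y_t = U_t W can be sketched as follows (variable names are illustrative; the sanity check uses synthetic features generated from a known weight matrix):

```python
import numpy as np

def ridge_weights(U_s, Y_s, delta=1e-6):
    """Closed-form ridge solution W = (U_s^T U_s + delta*I)^-1 U_s^T Y_s."""
    k = U_s.shape[1]
    return np.linalg.solve(U_s.T @ U_s + delta * np.eye(k), U_s.T @ Y_s)

# Sanity check: with targets generated by a known W_true and a small delta,
# the recovered weights reproduce the underlying linear map almost exactly.
rng = np.random.default_rng(3)
U_s = rng.standard_normal((50, 6))
W_true = rng.standard_normal((6, 3))
Y_s = U_s @ W_true
W = ridge_weights(U_s, Y_s)
U_t = rng.standard_normal((5, 6))
Y_t = U_t @ W          # prediction Y_t = U_t W
print(np.allclose(Y_t, U_t @ W_true, atol=1e-3))  # True
```

Using `np.linalg.solve` on the regularized normal equations avoids forming an explicit matrix inverse, which is both faster and numerically safer.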
The foregoing is only a preferred embodiment of the present invention; it should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the invention, and such modifications and refinements shall also fall within the protection scope of the invention.
Claims (6)
1. A hyperspectral image classification method based on a convolution width migration network is characterized by comprising the following steps:
step 1, perform dimension reduction on the original hyperspectral image using band selection to remove band redundancy, obtaining the dimension-reduced hyperspectral data X_0; X_0 includes the reduced-dimension source-domain data X_s and the target-domain data X_t.
Step 2, construct a domain adaptation layer and add it into the convolutional neural network, simultaneously aligning the marginal probability distributions and second-order statistics of the source and target domains to obtain the convolutional domain adaptation network CDAN; classify the target-domain data with the CDAN to obtain the target-domain pseudo labels.
2. The hyperspectral image classification method based on the convolution width migration network according to claim 1, further comprising,
step 3, combine the source-domain labels, the target-domain pseudo labels and the maximum mean discrepancy to construct the weighted conditional maximum mean discrepancy WCMMD, and add a domain adaptation regularization term based on the WCMMD into the MF mapping process to obtain the weighted conditional width learning network WCBN;
step 4, input the features extracted by the CDAN into the WCBN, use the WCBN to perform weighted conditional-probability-distribution alignment on the deep domain-invariant features extracted by the CDAN, and perform feature-width expansion on the domain-invariant features;
and 5, calculating an output weight value through a ridge regression theory.
3. The hyperspectral image classification method based on the convolution width migration network according to claim 1 or 2 is characterized in that the step 2 specifically comprises the following steps:
step 2.1, construct a domain adaptation layer and add it into the convolutional neural network, wherein the domain adaptation layer comprises a marginal (edge) domain adaptation term D_M(θ) and a covariance domain adaptation term D_C(θ), where θ denotes the network parameters; the convolutional domain adaptation network CDAN is thereby obtained;
the convolutional domain adaptation network CDAN comprises a convolutional layer, a nonlinear layer, a pooling layer, a fully connected layer, a domain adaptation layer and a classification layer connected in sequence;
step 2.2, connect the input X_0 to the convolutional layer through convolution kernels, and perform feature extraction sequentially through the convolutional layer, the nonlinear layer, the pooling layer and the fully connected layer;
step 2.3, feed the features extracted by the fully connected layer into the domain adaptation layer;
step 2.4, connect the output of the domain adaptation layer to the classification layer for classification,
the loss function of the convolutional domain adaptation network CDAN is defined as:
L(θ) = L_c(θ) + α_1·D_M(θ) + α_2·D_C(θ)
where L_c(θ) is the source-domain classification loss, D_M(θ) and D_C(θ) are the marginal and covariance terms of the adaptation layer, and α_1 and α_2 are the marginal and covariance domain adaptation parameters, respectively;
the pretrained convolutional domain adaptation network CDAN is used as an auxiliary classifier, and the target-domain pseudo labels are obtained with this auxiliary classifier.
4. The hyperspectral image classification method based on the convolution width migration network according to claim 2 is characterized in that the step 3 specifically comprises the following steps:
the deep domain-invariant features X ∈ R^{n×d_2} are extracted via the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X;
the extracted feature X is taken as the input and mapped to the feature nodes through the weights A_i; the ith group of mapped features MF is:
Z_i = X A_i + β_i,  i = 1, ..., d_M    (1)
in formula (1), A_i ∈ R^{d_2×G_M} and β_i are the connection weights and biases from the input X to MF, d_M is the number of feature-node groups, G_M is the dimension of each group, and Z = [Z_1, ..., Z_{d_M}] denotes the mapped features;
formula (1) is optimized as:
A_i = argmin_{A_i} ‖X A_i − Z_i‖_2^2 + λ‖A_i‖_1    (2)
where λ is the regularization coefficient;
the domain adaptation regularization term D_cf(P_s, P_t) based on the weighted conditional maximum mean discrepancy WCMMD is added into the MF mapping process to perform weighted adaptation of the conditional probability distributions of the source and target domains:
A_i = argmin_{A_i} ‖X A_i − Z_i‖_2^2 + λ‖A_i‖_1 + γ·D_cf(P_s, P_t)    (3)
where γ is the coefficient of the domain adaptation regularization term, c ∈ {1, 2, ..., C} is the class index, and the domain adaptation regularization term D_cf(P_s, P_t) is:
D_cf(P_s, P_t) = Σ_{c=1}^{C} ω_c ‖ (1/n_s^c) Σ_{x_j∈X_s^c} x_j A_i − (1/n_t^c) Σ_{x_k∈X_t^c} x_k A_i ‖_2^2
where n_s^c and n_t^c are the numbers of class-c samples in the source and target domains;
where the class-importance weight ω_c is constructed from the source-domain labels and the target-domain pseudo labels;
equation (3) can be reformulated and solved by the alternating direction method of multipliers (ADMM) to obtain the weight A_i;
further, after the required weight A_i is obtained, Z_i is computed as:
Zi=XAi
the weighted conditional width learning network is obtained through the above steps.
5. The hyperspectral image classification method based on the convolution width migration network according to claim 2 is characterized in that the step 4 specifically comprises the following steps:
the MF features are mapped to the enhancement nodes EN through random weights W_E to realize feature-width expansion of the domain-invariant features:
H=σ(ZWE)
where σ(·) is the tansig function, H ∈ R^{n×d_E} is the EN feature, and d_E is the number of EN nodes;
the source- and target-domain features in EN can be expressed as H_s = σ(Z_s W_E) and H_t = σ(Z_t W_E), where Z_s and H_s are the source-domain features of MF and EN, and Z_t and H_t are the target-domain features of MF and EN.
6. The hyperspectral image classification method based on the convolution width migration network according to claim 2 is characterized in that in the step 5, the output weight is calculated through a ridge regression theory, and the method specifically comprises the following steps:
MF and EN are mapped to the output layer simultaneously; the weighted conditional width learning network WCBN objective function is:
min_W ‖U_s W − Y_s‖_2^2 + δ‖W‖_2^2
where U_s = [Z_s | H_s] and δ is the regularization coefficient; solving the above formula by ridge regression theory yields:
W = (U_s^T U_s + δI)^{-1} U_s^T Y_s
the prediction result is as follows:
Yt=UtW
where U_t = [Z_t | H_t].
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111244668.7A CN113947725A (en) | 2021-10-26 | 2021-10-26 | Hyperspectral image classification method based on convolution width migration network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113947725A true CN113947725A (en) | 2022-01-18 |
Family
ID=79332583
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111244668.7A Pending CN113947725A (en) | 2021-10-26 | 2021-10-26 | Hyperspectral image classification method based on convolution width migration network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113947725A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114723994A (en) * | 2022-04-18 | 2022-07-08 | 中国矿业大学 | Hyperspectral image classification method based on dual-classifier confrontation enhancement network |
CN116150668A (en) * | 2022-12-01 | 2023-05-23 | 中国矿业大学 | Rotating equipment fault diagnosis method based on double-stage alignment partial migration network |
CN116150668B (en) * | 2022-12-01 | 2023-08-11 | 中国矿业大学 | Rotating equipment fault diagnosis method based on double-stage alignment partial migration network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||