CN113947725A - Hyperspectral image classification method based on convolution width migration network - Google Patents
Hyperspectral image classification method based on convolution width migration network
- Publication number
- CN113947725A (application CN202111244668.7A)
- Authority
- CN
- China
- Prior art keywords
- domain
- layer
- network
- adaptation
- field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 28
- 230000005012 migration Effects 0.000 title claims abstract description 16
- 238000013508 migration Methods 0.000 title claims abstract description 16
- 230000006978 adaptation Effects 0.000 claims abstract description 65
- 238000013527 convolutional neural network Methods 0.000 claims abstract description 10
- 238000013507 mapping Methods 0.000 claims description 22
- 230000003044 adaptive effect Effects 0.000 claims description 9
- 230000006870 function Effects 0.000 claims description 9
- 238000011176 pooling Methods 0.000 claims description 8
- 238000000605 extraction Methods 0.000 claims description 7
- 230000008569 process Effects 0.000 claims description 6
- 230000009467 reduction Effects 0.000 claims description 4
- 238000007781 pre-processing Methods 0.000 abstract 1
- 238000013135 deep learning Methods 0.000 description 3
- 238000011160 research Methods 0.000 description 3
- 238000004364 calculation method Methods 0.000 description 2
- 238000002372 labelling Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000001228 spectrum Methods 0.000 description 2
- 238000013145 classification model Methods 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 230000007123 defense Effects 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 229910052500 inorganic mineral Inorganic materials 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 238000010801 machine learning Methods 0.000 description 1
- 239000011707 mineral Substances 0.000 description 1
- 238000003062 neural network model Methods 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 238000012706 support-vector machine Methods 0.000 description 1
- 238000013526 transfer learning Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/10—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Molecular Biology (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Mathematical Physics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The invention discloses a hyperspectral image classification method based on a convolution width migration network. The original hyperspectral image is first preprocessed by band selection to remove band redundancy. A domain adaptation layer is then added to a convolutional neural network to simultaneously align the marginal probability distributions and second-order statistics of the source and target domains, yielding a convolutional domain adaptation network that extracts deep domain-invariant features from the original hyperspectral data. A weighted conditional maximum mean discrepancy is then proposed, and a regularization term based on it is added to a width learning network to obtain a weighted conditional width learning network, which reduces the discrepancy between the conditional probability distributions of the two domains and the class-weight bias while performing width expansion of the features. Finally, the output weights are rapidly computed by ridge regression. The method can complete unsupervised classification of the target-domain hyperspectral image using only labeled source-domain samples.
Description
Technical Field
The invention relates to the technical field of pattern recognition, in particular to a hyperspectral image classification method based on a convolution width migration network.
Background
The hyperspectral image contains abundant spectral and spatial information and has the characteristic of integrating image and spectrum in one data cube. This characteristic makes it widely used in agriculture, mineral exploration, national defense and security, and other fields. Hyperspectral image classification has become a research hotspot; it aims to infer the category to which each pixel in an image belongs according to the spectral and spatial information of the hyperspectral image. Many algorithms have been proposed to improve the classification accuracy of hyperspectral images, including support vector machines, sparse representation, convolutional neural networks, and so on. The excellent classification performance of these supervised learning algorithms usually requires a large number of labeled samples as support. However, labeling hyperspectral data consumes a large amount of manpower and material resources, and it is very difficult to obtain large-scale labeled data. Therefore, how to learn a classification model with strong generalization capability at low labeling cost is a research hotspot in the field of hyperspectral image analysis. To address these problems, some machine learning algorithms have been proposed, including active learning, semi-supervised learning, and data augmentation.
Unlike active learning, which actively reduces labeling cost, and semi-supervised learning, which obtains information from unlabeled samples, transfer (migration) learning can transfer knowledge from a relevant domain to other domains, i.e., from a source domain to a target domain. When the target domain has no labeled samples, or the labeled samples are insufficient in quantity, transfer learning can solve the classification problem of the target domain by using samples with the same or similar labels in the source domain.
Deep learning has strong nonlinear representation capability and can extract high-level, compact features from the input. The convolutional neural network is the best-known deep learning model and has strong feature extraction capability. In view of the above advantages, transfer learning models based on convolutional neural networks are very valuable for research.
The recently proposed width learning system (broad learning system) is a feed-forward neural network model consisting of only three layers: an input layer, a middle layer, and an output layer. Compared with deep learning, width learning can expand the features in width and has the advantages of a simple structure, fast computation, and easy combination with other models.
Disclosure of Invention
The purpose of the invention is as follows: in order to overcome the defects in the prior art, the invention provides a hyperspectral image classification method based on a convolution width migration learning network, which can complete unsupervised classification of the target-domain hyperspectral image using only labeled source-domain samples.
The technical scheme is as follows: the hyperspectral image classification method based on a convolution width migration network of the invention comprises the following steps:
Step 1: perform dimensionality reduction on the original hyperspectral image by band selection to remove band redundancy, obtaining dimension-reduced hyperspectral data X_0; the hyperspectral data X_0 comprise the dimension-reduced source-domain data X_s and target-domain data X_t.
Step 2: construct a domain adaptation layer and add it to a convolutional neural network, simultaneously aligning the marginal probability distributions and the second-order statistics of the source and target domains to obtain the convolutional domain adaptation network CDAN; classify the target-domain data with the CDAN to obtain the target-domain pseudo labels.
Further, the hyperspectral image classification method based on the convolution width migration network comprises the following steps:
Step 3: combine the source-domain labels, the target-domain pseudo labels, and the maximum mean discrepancy to construct the weighted conditional maximum mean discrepancy WCMMD, and add a domain adaptation regularization term based on the WCMMD into the mapped-feature (MF) mapping process to obtain the weighted conditional width learning network WCBN;
Step 4: input the features extracted by the CDAN into the WCBN, use the WCBN to perform weighted conditional probability distribution alignment on the deep domain-invariant features extracted by the CDAN, and perform feature width expansion on the domain-invariant features;
Step 5: calculate the output weights by ridge regression.
Further, step 2 specifically includes the following steps:
Step 2.1: construct a domain adaptation layer and add it to a convolutional neural network, the domain adaptation layer comprising a marginal domain adaptation term D_M(θ) and a covariance domain adaptation term D_C(θ), where θ denotes the network parameters; the convolutional domain adaptation network CDAN is thereby obtained;
The convolutional domain adaptation network CDAN comprises a convolution layer, a nonlinear layer, a pooling layer, a fully connected layer, a domain adaptation layer, and a classification layer, connected in sequence;
Step 2.2: connect the input X_0 to the convolution layer through convolution kernels, and perform feature extraction sequentially through the convolution layer, the nonlinear layer, the pooling layer, and the fully connected layer;
Step 2.3: input the features extracted by the fully connected layer into the domain adaptation layer;
Step 2.4: connect the output of the domain adaptation layer to the classification layer for classification.
The loss function of the convolutional domain adaptation network CDAN is defined as:
L(θ) = L_c(X_s, Y_s) + α_1 D_M(θ) + α_2 D_C(θ)
where L_c(X_s, Y_s) is the source-domain classification loss, and α_1 and α_2 are the marginal domain adaptation parameter and the covariance domain adaptation parameter, respectively;
The pretrained convolutional domain adaptation network CDAN is taken as an auxiliary classifier, and the target-domain pseudo labels are obtained with this auxiliary classifier.
Further, step 3 specifically includes the following steps:
Deep domain-invariant features X ∈ R^{n×d_2} are extracted via the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X;
Taking the extracted features X as input, X is mapped to the feature nodes by the weights A_i; the i-th group of mapped features MF is:
Z_i = X A_i + β_{e_i},  i = 1, ..., d_M    (1)
where A_i and β_{e_i} respectively denote the connection weights and biases from the input X to the MF, d_M is the number of groups of feature nodes, G_M is the dimension of each group of feature nodes, and Z denotes the mapped features;
Formula (1) is optimized as:
A_i = argmin_{A_i} ||X A_i − Z_i||² + λ ||A_i||₁    (2)
where λ denotes the regularization coefficient;
The domain adaptation regularization term D_cf(P_s, P_t) based on the weighted conditional maximum mean discrepancy WCMMD is added into the MF mapping process, performing weighted adaptation of the conditional probability distributions of the source and target domains:
A_i = argmin_{A_i} ||X A_i − Z_i||² + λ ||A_i||₁ + γ D_cf(P_s, P_t)    (3)
where γ denotes the coefficient of the domain adaptation regularization term, c ∈ {1, 2, ..., C} is the class index, and the domain adaptation regularization term D_cf(P_s, P_t) is:
D_cf(P_s, P_t) = Σ_{c=1}^{C} ω_c || (1/n_s^c) Σ_{x^s ∈ X_s^c} x^s A_i − (1/n_t^c) Σ_{x^t ∈ X_t^c} x^t A_i ||²
where the class importance weights ω_c are constructed from the source-domain labels and the target-domain pseudo labels, and n_s^c and n_t^c are the numbers of source-domain and target-domain samples of class c;
Equation (3) can be written in an equivalent form that is solved by the alternating direction method of multipliers (ADMM) to obtain the weights A;
Further, after the required weights A_i are obtained, Z_i can be obtained by the following formula:
Z_i = X A_i
The weighted conditional width learning network is obtained through the above steps.
Further, step 4 specifically includes the following steps:
The MF features are mapped to the enhancement nodes EN through the random weights W_E to realize feature width expansion of the domain-invariant features:
H = σ(Z W_E)
where W_E denotes the random weights, σ(·) is the tansig function, H is the EN feature, and d_E denotes the number of EN nodes;
The source-domain and target-domain features in the EN can be expressed as H_s = σ(Z_s W_E) and H_t = σ(Z_t W_E); Z_s and H_s are the source-domain features of MF and EN, and Z_t and H_t are the target-domain features of MF and EN.
Further, in step 5, the output weights are calculated by ridge regression, specifically as follows:
The MF and EN features are simultaneously mapped to the output layer, and the objective function of the weighted conditional width learning network WCBN is:
min_W ||U_s W − Y_s||² + δ ||W||²
where U_s = [Z_s | H_s] and δ is the regularization coefficient; solving the above formula by ridge regression gives:
W = (U_sᵀ U_s + δI)⁻¹ U_sᵀ Y_s
The prediction result is:
Y_t = U_t W
has the advantages that: the method can complete the unsupervised classification of the hyperspectral images of the target domain only by using the source domain labeled samples.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings.
As shown in fig. 1, the hyperspectral image classification method based on the convolution width migration learning network of the invention includes the following steps:
Step 1: perform dimensionality reduction on the original hyperspectral image by band selection to remove band redundancy, obtaining dimension-reduced hyperspectral data X_0; the hyperspectral data X_0 comprise the dimension-reduced source-domain data X_s and target-domain data X_t.
Define the number of bands of the original hyperspectral image HSI as N_b and the number of bands after dimensionality reduction as d; bands are selected at the two sampling intervals a and b respectively, where ⌊·⌋ denotes the rounding-down (floor) operation, and the intervals a and b are obtained from N_b and d accordingly.
defining reduced X0∈Rn×dAs input to the model, wherein X0Representing hyperspectral image data and n representing the number of samples.
X_s ∈ R^{n_s×d} and X_t ∈ R^{n_t×d} are respectively the source-domain and target-domain data after band selection, where n_s denotes the number of source-domain samples, n_t denotes the number of target-domain samples, and Y_s is the source-domain label.
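Band selection at a fixed sampling interval can be sketched as follows. This is a hedged illustration: the uniform-interval rule, the function name band_select, and all array sizes are assumptions for the example, not the exact interval formula of the specification.

```python
import numpy as np

def band_select(X, d):
    """Keep d bands from the N_b original bands by uniform-interval sampling
    with a floor (rounding-down) operation; an illustrative stand-in for the
    interval rule described above."""
    n_b = X.shape[1]
    idx = np.floor(np.linspace(0, n_b - 1, num=d)).astype(int)
    return X[:, idx]

# Example with assumed sizes: 200 original bands reduced to d = 100
Xs0 = band_select(np.random.rand(500, 200), d=100)   # source-domain data, n_s = 500
Xt0 = band_select(np.random.rand(800, 200), d=100)   # target-domain data, n_t = 800
```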
Step 2: construct a domain adaptation layer and add it to a convolutional neural network, simultaneously aligning the marginal probability distributions and the second-order statistics of the source and target domains to obtain the convolutional domain adaptation network CDAN. The target-domain data are then classified with the convolutional domain adaptation network CDAN to obtain the target-domain pseudo labels.
Step 2.1, constructing a domain adaptation layer and adding the domain adaptation layer into a convolutional neural network, wherein the domain adaptation layer comprises: edge domain adaptation itemsCovariance field adaptation entryWherein θ represents a network parameter; and obtaining the CDAN of the convolution domain adaptation network.
The convolution domain adaptation network CDAN comprises a convolution layer, a nonlinear layer, a pooling layer, a full-link layer, a domain adaptation layer and a classification layer which are sequentially connected;
Step 2.2: connect the input to the convolution layer through convolution kernels to obtain the feature maps, computed as:
F_C = I_C * K_C + b_C
where I_C is the convolutional layer input, * denotes the convolution operation, K_C is the convolution kernel, and b_C is the bias. The convolutional layer output is connected as the input of the nonlinear layer, whose output is:
F_N = f(I_N)
where I_N is the nonlinear layer input and f(·) is the nonlinear activation function. The nonlinear layer output is connected as the input of the pooling layer, whose output is:
F_P = down(I_P)
where I_P is the pooling layer input. The pooling layer output is connected as the input of the fully connected layer, whose output is:
F_F = I_F W_F + b_F
where I_F is the fully connected layer input, and W_F and b_F are the fully connected layer weights and bias.
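A minimal sketch of this convolution → nonlinearity → pooling → fully connected feature-extraction path is given below. The layer sizes (16 kernels of size 3×3, 7×7 input patches, a 128-dimensional fully connected output) and the ReLU choice for f(·) are assumptions made for illustration, not values taken from the specification.

```python
import torch
import torch.nn as nn

class CDANFeatureExtractor(nn.Module):
    """Hedged sketch of the feature-extraction path feeding the domain adaptation layer."""
    def __init__(self, in_bands=100, feat_dim=128):
        super().__init__()
        self.conv = nn.Conv2d(in_bands, 16, kernel_size=3, padding=1)  # F_C = I_C * K_C + b_C
        self.nonlinear = nn.ReLU()                                     # F_N = f(I_N)
        self.pool = nn.MaxPool2d(2)                                    # F_P = down(I_P)
        self.fc = nn.Linear(16 * 3 * 3, feat_dim)                      # F_F = I_F W_F + b_F

    def forward(self, x):                      # x: (batch, bands, 7, 7) pixel-centered patches
        f = self.pool(self.nonlinear(self.conv(x)))
        return self.fc(f.flatten(1))           # features passed to the domain adaptation layer

feats = CDANFeatureExtractor()(torch.randn(4, 100, 7, 7))   # (4, 128) feature vectors
```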
And 2.2, inputting the features subjected to the feature extraction of the full connection layer into a field adaptation layer so as to reduce the distribution difference of a source field and a target field.
Step 2.3, the output of the domain adaptation layer is connected to the classification layer for classification,
The loss function of the convolutional domain adaptation network CDAN is defined as:
L(θ) = L_c(X_s, Y_s) + α_1 D_M(θ) + α_2 D_C(θ)
where L_c(X_s, Y_s) is the source-domain classification loss, and α_1 and α_2 are the marginal domain adaptation parameter and the covariance domain adaptation parameter, respectively.
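One way to realize the two adaptation terms is sketched below, assuming a linear-kernel maximum mean discrepancy for the marginal term D_M and a CORAL-style covariance difference for the second-order term D_C; the kernel choice, the coefficient values, and the function names are assumptions, since the specification does not spell them out here.

```python
import torch
import torch.nn as nn

def marginal_mmd(fs, ft):
    """D_M: squared distance between source/target feature means (linear-kernel MMD)."""
    return (fs.mean(dim=0) - ft.mean(dim=0)).pow(2).sum()

def covariance_diff(fs, ft):
    """D_C: CORAL-style squared Frobenius distance between feature covariances."""
    cs, ct = torch.cov(fs.T), torch.cov(ft.T)
    return (cs - ct).pow(2).sum() / (4 * fs.shape[1] ** 2)

cross_entropy = nn.CrossEntropyLoss()

def cdan_loss(logits_s, ys, feat_s, feat_t, alpha1=1.0, alpha2=1.0):
    """L(theta) = L_c + alpha_1 * D_M + alpha_2 * D_C, with illustrative alpha values."""
    return (cross_entropy(logits_s, ys)
            + alpha1 * marginal_mmd(feat_s, feat_t)
            + alpha2 * covariance_diff(feat_s, feat_t))
```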
And taking the pretrained convolution field adaptation network CDAN as an auxiliary classifier, and obtaining the target field pseudo label by using the auxiliary classifier.
And 3, combining the source domain label, the target domain pseudo label and the maximum mean difference to construct a weighted condition maximum mean difference WCMMD, and adding a field adaptation regular term based on the weighted condition maximum mean difference WCMMD into the MF mapping process to obtain a weighted condition width learning network WCBN. The method specifically comprises the following steps:
Deep domain-invariant features X ∈ R^{n×d_2} are extracted via the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X.
Taking the extracted feature X as an input, and weighting the feature X by weight AiMapping to the feature node, the ith group of mapping features MF is:
Zi=XAi+βei,i=1,...,dM (1)
in the formula (1), the reaction mixture is,Aiand betaejRespectively representing the connection weights and deviations of the inputs X to MF, dMNumber of groups of characteristic nodes, GMFor each set of feature node dimensions, Z represents a mapping feature.
Formula (1) is optimized as:
in the formula, λ represents a regular term coefficient.
Field adaptation regularization term D based on weighted condition maximum mean difference WCMMDcf(Ps,Pt) Adding the source domain and the target domain into an MF mapping process, and performing weighted adaptation on conditional probability distribution of the source domain and the target domain:
in the formula, gamma represents a coefficient of a field adaptation regular term, C is belonged to {1, 2., C } and is a category index, and the field adaptation regular term D iscf(Ps,Pt) Comprises the following steps:
in the formula, the class importance weight ωcAnd constructing according to the source domain label and the target domain pseudo label.
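A sketch of the class importance weights ω_c and the resulting WCMMD value is given below. Taking ω_c as the ratio of target to source class proportions is an assumption (the specification only states that ω_c is built from the source labels and target pseudo labels), and all function names are illustrative.

```python
import numpy as np

def class_weights(ys, yt_pseudo, C):
    """omega_c from source labels and target pseudo labels (assumed: ratio of
    target class proportion to source class proportion)."""
    ps = np.bincount(ys, minlength=C) / len(ys)
    pt = np.bincount(yt_pseudo, minlength=C) / len(yt_pseudo)
    return pt / (ps + 1e-8)

def wcmmd(Zs, Zt, ys, yt_pseudo, C):
    """Weighted conditional MMD: per-class squared mean difference of the mapped
    source/target features, weighted by omega_c and summed over classes."""
    w = class_weights(ys, yt_pseudo, C)
    d = 0.0
    for c in range(C):
        ms = Zs[ys == c].mean(axis=0) if np.any(ys == c) else 0.0
        mt = Zt[yt_pseudo == c].mean(axis=0) if np.any(yt_pseudo == c) else 0.0
        d += w[c] * np.sum((ms - mt) ** 2)
    return d
```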
Equation (3) can be written as:
the above formula can be solved by an alternative direction multiplier method ADMM to obtain the weight A.
Further, the required weight A can be obtainediThen Z isiCan be obtained by the following formula:
Zi=XAi
and obtaining the weighted condition width learning network through the steps.
And 4, inputting the features extracted by the CDAN into the WCBN, performing weighted conditional probability distribution alignment on the depth field invariance features extracted by the CDAN by using the WCBN, reducing the difference of two-domain conditional probability distribution and class weight deviation, and performing feature width expansion on the field invariance features. Further enhancing the feature representation capability.
Passing the characteristics of MF through random weight WEMapping to an incremental node EN to realize feature width expansion of the domain invariance feature:
H=σ(ZWE)
in the formula (I), the compound is shown in the specification,σ (.) is the tansig function,is characterized by EN, dEIndicating the number of EN nodes.
The source and target domain characteristics in an EN can be expressed as: hs=σ(ZsWE) And Ht=σ(ZtWE)。ZsAnd HsIs a source domain feature of MF and EN, ZtAnd HtAre MF and EN target domain features.
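The width-expansion step can be sketched as below; the stand-in shapes (60 mapped-feature columns, d_E = 100 enhancement nodes, 500/800 source/target samples) are assumptions, and tansig is implemented as tanh, to which it is mathematically equivalent.

```python
import numpy as np

rng = np.random.default_rng(0)

def enhancement_nodes(Z, W_E):
    """H = tansig(Z W_E): map the MF features to enhancement-node features
    through fixed random weights W_E."""
    return np.tanh(Z @ W_E)

# Stand-ins for the mapped features Z_s, Z_t produced by the MF step above.
Zs = rng.standard_normal((500, 60))
Zt = rng.standard_normal((800, 60))
W_E = rng.standard_normal((60, 100))          # random weights, d_E = 100 enhancement nodes

Hs = enhancement_nodes(Zs, W_E)               # source-domain EN features H_s
Ht = enhancement_nodes(Zt, W_E)               # target-domain EN features H_t
```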
Step 5: rapidly calculate the output weights by ridge regression.
And mapping MF and EN to an input layer simultaneously, wherein a weighted condition width learning network WCBN objective function is as follows:
in the formula of Us=[Zs|Hs]And δ is a regular term coefficient. Solving the above formula by using ridge regression theory to obtain:
the prediction result is as follows:
Yt=UtW
in the formula of Ut=[Zt|Ht]。
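A self-contained sketch of the closed-form ridge solution and the target-domain prediction follows; the feature matrices, the one-hot source labels, the class count C = 9, and δ = 0.01 are stand-in assumptions used only so that the example runs.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for the MF/EN features and one-hot source labels produced above.
Zs, Hs = rng.standard_normal((500, 60)), rng.standard_normal((500, 100))
Zt, Ht = rng.standard_normal((800, 60)), rng.standard_normal((800, 100))
Ys_onehot = np.eye(9)[rng.integers(0, 9, 500)]            # C = 9 classes, illustrative

def ridge_output_weights(Us, Ys, delta=1e-2):
    """W = (Us^T Us + delta * I)^(-1) Us^T Ys (ridge regression closed form)."""
    return np.linalg.solve(Us.T @ Us + delta * np.eye(Us.shape[1]), Us.T @ Ys)

Us = np.hstack([Zs, Hs])                                   # U_s = [Z_s | H_s]
Ut = np.hstack([Zt, Ht])                                   # U_t = [Z_t | H_t]
W = ridge_output_weights(Us, Ys_onehot)
Yt_pred = np.argmax(Ut @ W, axis=1)                        # predicted target-domain classes
```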
The foregoing is only a preferred embodiment of the present invention. It should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be regarded as falling within the protection scope of the present invention.
Claims (6)
1. A hyperspectral image classification method based on a convolution width migration network is characterized by comprising the following steps:
Step 1: perform dimensionality reduction on the original hyperspectral image by band selection to remove band redundancy, obtaining dimension-reduced hyperspectral data X_0; the hyperspectral data X_0 comprise the dimension-reduced source-domain data X_s and target-domain data X_t;
Step 2: construct a domain adaptation layer and add it to a convolutional neural network, simultaneously aligning the marginal probability distributions and the second-order statistics of the source and target domains to obtain the convolutional domain adaptation network CDAN; classify the target-domain data with the convolutional domain adaptation network CDAN to obtain the target-domain pseudo labels.
2. The hyperspectral image classification method based on the convolution width migration network according to claim 1, further comprising,
Step 3: combine the source-domain labels, the target-domain pseudo labels, and the maximum mean discrepancy to construct the weighted conditional maximum mean discrepancy WCMMD, and add a domain adaptation regularization term based on the WCMMD into the MF mapping process to obtain the weighted conditional width learning network WCBN;
Step 4: input the features extracted by the CDAN into the WCBN, use the WCBN to perform weighted conditional probability distribution alignment on the deep domain-invariant features extracted by the CDAN, and perform feature width expansion on the domain-invariant features;
Step 5: calculate the output weights by ridge regression.
3. The hyperspectral image classification method based on the convolution width migration network according to claim 1 or 2 is characterized in that the step 2 specifically comprises the following steps:
Step 2.1: construct a domain adaptation layer and add it to a convolutional neural network, the domain adaptation layer comprising a marginal domain adaptation term D_M(θ) and a covariance domain adaptation term D_C(θ), where θ denotes the network parameters; the convolutional domain adaptation network CDAN is thereby obtained;
The convolutional domain adaptation network CDAN comprises a convolution layer, a nonlinear layer, a pooling layer, a fully connected layer, a domain adaptation layer, and a classification layer, connected in sequence;
Step 2.2: connect the input X_0 to the convolution layer through convolution kernels, and perform feature extraction sequentially through the convolution layer, the nonlinear layer, the pooling layer, and the fully connected layer;
Step 2.3: input the features extracted by the fully connected layer into the domain adaptation layer;
Step 2.4: connect the output of the domain adaptation layer to the classification layer for classification;
The loss function of the convolutional domain adaptation network CDAN is defined as:
L(θ) = L_c(X_s, Y_s) + α_1 D_M(θ) + α_2 D_C(θ)
where L_c(X_s, Y_s) is the source-domain classification loss, and α_1 and α_2 are the marginal domain adaptation parameter and the covariance domain adaptation parameter, respectively;
The pretrained convolutional domain adaptation network CDAN is taken as an auxiliary classifier, and the target-domain pseudo labels are obtained with this auxiliary classifier.
4. The hyperspectral image classification method based on the convolution width migration network according to claim 2 is characterized in that the step 3 specifically comprises the following steps:
Deep domain-invariant features X ∈ R^{n×d_2} are extracted via the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X;
Taking the extracted features X as input, X is mapped to the feature nodes by the weights A_i; the i-th group of mapped features MF is:
Z_i = X A_i + β_{e_i},  i = 1, ..., d_M    (1)
where A_i and β_{e_i} respectively denote the connection weights and biases from the input X to the MF, d_M is the number of groups of feature nodes, G_M is the dimension of each group of feature nodes, and Z denotes the mapped features;
Formula (1) is optimized as:
A_i = argmin_{A_i} ||X A_i − Z_i||² + λ ||A_i||₁    (2)
where λ denotes the regularization coefficient;
The domain adaptation regularization term D_cf(P_s, P_t) based on the weighted conditional maximum mean discrepancy WCMMD is added into the MF mapping process, performing weighted adaptation of the conditional probability distributions of the source and target domains:
A_i = argmin_{A_i} ||X A_i − Z_i||² + λ ||A_i||₁ + γ D_cf(P_s, P_t)    (3)
where γ denotes the coefficient of the domain adaptation regularization term, c ∈ {1, 2, ..., C} is the class index, and the domain adaptation regularization term D_cf(P_s, P_t) is:
D_cf(P_s, P_t) = Σ_{c=1}^{C} ω_c || (1/n_s^c) Σ_{x^s ∈ X_s^c} x^s A_i − (1/n_t^c) Σ_{x^t ∈ X_t^c} x^t A_i ||²
where the class importance weights ω_c are constructed from the source-domain labels and the target-domain pseudo labels, and n_s^c and n_t^c are the numbers of source-domain and target-domain samples of class c;
Equation (3) can be written in an equivalent form that is solved by the alternating direction method of multipliers (ADMM) to obtain the weights A;
Further, after the required weights A_i are obtained, Z_i can be obtained by the following formula:
Z_i = X A_i
The weighted conditional width learning network is obtained through the above steps.
5. The hyperspectral image classification method based on the convolution width migration network according to claim 2 is characterized in that the step 4 specifically comprises the following steps:
The MF features are mapped to the enhancement nodes EN through the random weights W_E to realize feature width expansion of the domain-invariant features:
H = σ(Z W_E)
where W_E denotes the random weights, σ(·) is the tansig function, H is the EN feature, and d_E denotes the number of EN nodes;
The source-domain and target-domain features in the EN can be expressed as H_s = σ(Z_s W_E) and H_t = σ(Z_t W_E); Z_s and H_s are the source-domain features of MF and EN, and Z_t and H_t are the target-domain features of MF and EN.
6. The hyperspectral image classification method based on the convolution width migration network according to claim 2 is characterized in that in the step 5, the output weight is calculated through a ridge regression theory, and the method specifically comprises the following steps:
The MF and EN features are simultaneously mapped to the output layer, and the objective function of the weighted conditional width learning network WCBN is:
min_W ||U_s W − Y_s||² + δ ||W||²
where U_s = [Z_s | H_s] and δ is the regularization coefficient; solving the above formula by ridge regression gives:
W = (U_sᵀ U_s + δI)⁻¹ U_sᵀ Y_s
The prediction result is:
Y_t = U_t W
where U_t = [Z_t | H_t].
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111244668.7A CN113947725B (en) | 2021-10-26 | 2021-10-26 | Hyperspectral image classification method based on convolution width migration network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111244668.7A CN113947725B (en) | 2021-10-26 | 2021-10-26 | Hyperspectral image classification method based on convolution width migration network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113947725A true CN113947725A (en) | 2022-01-18 |
CN113947725B CN113947725B (en) | 2024-06-14 |
Family
ID=79332583
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111244668.7A Active CN113947725B (en) | 2021-10-26 | 2021-10-26 | Hyperspectral image classification method based on convolution width migration network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113947725B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114723994A (en) * | 2022-04-18 | 2022-07-08 | 中国矿业大学 | Hyperspectral image classification method based on dual-classifier confrontation enhancement network |
CN116150668A (en) * | 2022-12-01 | 2023-05-23 | 中国矿业大学 | Rotating equipment fault diagnosis method based on double-stage alignment partial migration network |
CN118247584A (en) * | 2024-05-28 | 2024-06-25 | 安徽大学 | Hyperspectral image domain generalization method, hyperspectral image domain generalization device, hyperspectral image domain generalization system and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109344891A (en) * | 2018-09-21 | 2019-02-15 | 北京航空航天大学 | A kind of high-spectrum remote sensing data classification method based on deep neural network |
CN109359623A (en) * | 2018-11-13 | 2019-02-19 | 西北工业大学 | High spectrum image based on depth Joint Distribution adaptation network migrates classification method |
US20200130177A1 (en) * | 2018-10-29 | 2020-04-30 | Hrl Laboratories, Llc | Systems and methods for few-shot transfer learning |
CN111709448A (en) * | 2020-05-20 | 2020-09-25 | 西安交通大学 | Mechanical fault diagnosis method based on migration relation network |
AU2020103905A4 (en) * | 2020-12-04 | 2021-02-11 | Chongqing Normal University | Unsupervised cross-domain self-adaptive medical image segmentation method based on deep adversarial learning |
CN112906756A (en) * | 2021-01-28 | 2021-06-04 | 昆山则立诚智能设备有限公司 | High-image classification method and system for cross-channel quantity transfer learning |
- 2021 — 2021-10-26: CN application CN202111244668.7A filed; granted as CN113947725B (Active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109344891A (en) * | 2018-09-21 | 2019-02-15 | 北京航空航天大学 | A kind of high-spectrum remote sensing data classification method based on deep neural network |
US20200130177A1 (en) * | 2018-10-29 | 2020-04-30 | Hrl Laboratories, Llc | Systems and methods for few-shot transfer learning |
CN109359623A (en) * | 2018-11-13 | 2019-02-19 | 西北工业大学 | High spectrum image based on depth Joint Distribution adaptation network migrates classification method |
CN111709448A (en) * | 2020-05-20 | 2020-09-25 | 西安交通大学 | Mechanical fault diagnosis method based on migration relation network |
AU2020103905A4 (en) * | 2020-12-04 | 2021-02-11 | Chongqing Normal University | Unsupervised cross-domain self-adaptive medical image segmentation method based on deep adversarial learning |
CN112906756A (en) * | 2021-01-28 | 2021-06-04 | 昆山则立诚智能设备有限公司 | High-image classification method and system for cross-channel quantity transfer learning |
Non-Patent Citations (1)
Title |
---|
樊帅昌; 易晓梅; 李剑; 惠国华; 郜园园: "基于深度残差网络与迁移学习的毒蕈图像识别" (Poisonous mushroom image recognition based on deep residual networks and transfer learning), 传感技术学报 (Chinese Journal of Sensors and Actuators), no. 01, 15 January 2020 (2020-01-15) *
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114723994A (en) * | 2022-04-18 | 2022-07-08 | 中国矿业大学 | Hyperspectral image classification method based on dual-classifier confrontation enhancement network |
CN116150668A (en) * | 2022-12-01 | 2023-05-23 | 中国矿业大学 | Rotating equipment fault diagnosis method based on double-stage alignment partial migration network |
CN116150668B (en) * | 2022-12-01 | 2023-08-11 | 中国矿业大学 | Rotating equipment fault diagnosis method based on double-stage alignment partial migration network |
CN118247584A (en) * | 2024-05-28 | 2024-06-25 | 安徽大学 | Hyperspectral image domain generalization method, hyperspectral image domain generalization device, hyperspectral image domain generalization system and storage medium |
CN118247584B (en) * | 2024-05-28 | 2024-08-09 | 安徽大学 | Hyperspectral image domain generalization method, hyperspectral image domain generalization device, hyperspectral image domain generalization system and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN113947725B (en) | 2024-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113947725A (en) | Hyperspectral image classification method based on convolution width migration network | |
CN111275092B (en) | Image classification method based on unsupervised domain adaptation | |
Jin et al. | Object-oriented method combined with deep convolutional neural networks for land-use-type classification of remote sensing images | |
CN112015863B (en) | Multi-feature fusion Chinese text classification method based on graphic neural network | |
CN111160553B (en) | Novel field self-adaptive learning method | |
CN111401426B (en) | Small sample hyperspectral image classification method based on pseudo label learning | |
CN111008644B (en) | Ecological change monitoring method based on local dynamic energy function FCN-CRF model | |
CN115410088B (en) | Hyperspectral image field self-adaption method based on virtual classifier | |
Wang et al. | Hyperspectral image classification based on domain adversarial broad adaptation network | |
CN107273919A (en) | A kind of EO-1 hyperion unsupervised segmentation method that generic dictionary is constructed based on confidence level | |
CN116597312A (en) | Crop leaf disease and pest identification method based on small sample image semantic segmentation | |
CN113570161B (en) | Method for constructing stirred tank reactant concentration prediction model based on width transfer learning | |
CN111797911A (en) | Image data multi-label classification method | |
CN113408652B (en) | Semi-supervised learning image classification method based on group representation features | |
Dai et al. | Research on hyper-spectral remote sensing image classification by applying stacked de-noising auto-encoders neural network | |
Dahiya et al. | A review on deep learning classifier for hyperspectral imaging | |
CN114049567B (en) | Adaptive soft label generation method and application in hyperspectral image classification | |
CN110222793B (en) | Online semi-supervised classification method and system based on multi-view active learning | |
Luo et al. | Anomaly detection for image data based on data distribution and reconstruction | |
CN101950363B (en) | SAR image monitoring and classifying method based on conditional random field model | |
CN115496948A (en) | Network supervision fine-grained image identification method and system based on deep learning | |
Molina et al. | A model-based deep transfer learning algorithm for phenology forecasting using satellite imagery | |
CN114494762A (en) | Hyperspectral image classification method based on deep migration network | |
CN113987170A (en) | Multi-label text classification method based on convolutional neural network | |
CN106127224A (en) | Based on the semi-supervised polarization SAR sorting technique quickly auctioning figure |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |