CN113947725A - Hyperspectral image classification method based on convolution width migration network - Google Patents

Hyperspectral image classification method based on convolution width migration network

Info

Publication number
CN113947725A
CN113947725A (application CN202111244668.7A; granted publication CN113947725B)
Authority
CN
China
Prior art keywords
domain
layer
network
adaptation
field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111244668.7A
Other languages
Chinese (zh)
Other versions
CN113947725B (en)
Inventor
程玉虎
王浩宇
王雪松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Mining and Technology CUMT
Original Assignee
China University of Mining and Technology CUMT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Mining and Technology CUMT filed Critical China University of Mining and Technology CUMT
Priority to CN202111244668.7A priority Critical patent/CN113947725B/en
Publication of CN113947725A publication Critical patent/CN113947725A/en
Application granted granted Critical
Publication of CN113947725B publication Critical patent/CN113947725B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a hyperspectral image classification method based on a convolution width migration network. First, the original hyperspectral image is preprocessed with band selection to remove band redundancy. Then, a domain adaptation layer is added to a convolutional neural network to simultaneously align the marginal probability distributions and second-order statistics of the source and target domains, yielding a convolutional domain adaptation network that extracts deep domain-invariant features from the original hyperspectral data. Next, the weighted conditional maximum mean discrepancy is proposed, and a regularization term based on it is added to a width network to obtain a weighted conditional width learning network, which reduces the discrepancy between the conditional probability distributions of the two domains and the class weight bias while performing width expansion of the features. Finally, the output weights are computed quickly via ridge regression. The method can complete unsupervised classification of target domain hyperspectral images using only labeled source domain samples.

Description

Hyperspectral image classification method based on convolution width migration network
Technical Field
The invention relates to the technical field of pattern recognition, and in particular to a hyperspectral image classification method based on a convolution width migration network.
Background
Hyperspectral images contain rich spectral and spatial information and are characterized by the integration of image and spectrum. This property has led to their wide use in agriculture, mineral exploration, national defense and security, and other fields. Hyperspectral image classification, which aims to infer the category of each pixel from the spectral and spatial information of the image, has become a research hotspot. Many algorithms have been proposed to improve the classification accuracy of hyperspectral images, including support vector machines, sparse representations, and convolutional neural networks. The excellent classification performance of these supervised learning algorithms usually requires a large number of labeled samples. However, labeling hyperspectral data consumes considerable manpower and material resources, and obtaining large-scale labeled data is very difficult. Therefore, how to learn a classification model with strong generalization capability at low labeling cost is a research hotspot in hyperspectral image analysis. To address this problem, several machine learning approaches have been proposed, including active learning, semi-supervised learning, and data augmentation.
Unlike active learning, which actively reduces labeling cost, and semi-supervised learning, which extracts information from unlabeled samples, transfer learning can transfer knowledge from a relevant domain to other domains, i.e., from a source domain to a target domain. When the target domain has no labeled samples or an insufficient number of them, transfer learning can solve the target domain classification problem using source domain samples with the same or similar labels.
Deep learning has strong nonlinear representation capability and can extract high-level, compact features from the input. The convolutional neural network is the best-known deep learning model and has strong feature extraction capability. Given these advantages, transfer learning models based on convolutional neural networks are well worth studying.
The recently proposed width learning system (also known as the broad learning system) is a feedforward neural network model consisting of only three layers: an input layer, a middle layer, and an output layer. Compared with deep learning, width learning can realize width expansion of features, and it has the advantages of a simple structure, fast computation, and easy combination with other models.
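The three-layer structure described above can be illustrated with a minimal NumPy sketch of one forward pass. All variable names and sizes here are toy assumptions, not values from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, g, e, c = 6, 4, 5, 7, 3           # samples, input dim, MF dim, EN dim, classes (toy)
X = rng.standard_normal((n, d))          # input layer

A = rng.standard_normal((d, g))          # input -> mapped feature (MF) weights
Z = X @ A                                # mapped features (middle layer, part 1)
W_E = rng.standard_normal((g, e))        # MF -> enhancement node (EN) weights, random
H = np.tanh(Z @ W_E)                     # enhancement (width-expanded) features
U = np.hstack([Z, H])                    # middle layer: MF and EN concatenated
W = rng.standard_normal((g + e, c))      # output weights (the only trained parameters)
Y = U @ W                                # output layer
print(Y.shape)  # (6, 3)
```

Only the output weights W are learned (by ridge regression, see step 5); the mapping and enhancement weights stay fixed, which is what makes width learning fast.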
Disclosure of Invention
The purpose of the invention: to overcome the deficiencies of the prior art, the invention provides a hyperspectral image classification method based on a convolution width migration learning network that can complete unsupervised classification of target domain hyperspectral images using only labeled source domain samples.
The technical scheme: the hyperspectral image classification method based on the convolution width migration network of the invention comprises the following steps:
step 1, performing dimensionality reduction on the original hyperspectral image by band selection to remove band redundancy, and obtaining reduced-dimension hyperspectral data X^0, the hyperspectral data X^0 comprising the reduced-dimension source domain data X_s^0 and the target domain data X_t^0;
and step 2, constructing a domain adaptation layer, adding the domain adaptation layer into a convolutional neural network, simultaneously aligning the marginal probability distributions and the second-order statistics of the source domain and the target domain to obtain a convolutional domain adaptation network CDAN, and classifying the target domain data with the convolutional domain adaptation network CDAN to obtain target domain pseudo labels.
Further, the hyperspectral image classification method based on the convolution width migration network further comprises the following steps:
step 3, combining the source domain labels, the target domain pseudo labels, and the maximum mean discrepancy to construct the weighted conditional maximum mean discrepancy WCMMD, and adding a domain adaptation regularization term based on WCMMD to the MF mapping process to obtain the weighted conditional width learning network WCBN;
step 4, inputting the features extracted by CDAN to WCBN, performing weighted conditional probability distribution alignment on the deep domain-invariant features extracted by CDAN using WCBN, and performing feature width expansion on the domain-invariant features;
and step 5, computing the output weights via ridge regression.
Further, step 2 specifically includes the following steps:
step 2.1, constructing a domain adaptation layer and adding it into the convolutional neural network, wherein the domain adaptation layer comprises a marginal domain adaptation term L_M(θ) and a covariance domain adaptation term L_C(θ), where θ denotes the network parameters, thereby obtaining the convolutional domain adaptation network CDAN;
the convolutional domain adaptation network CDAN comprises a convolutional layer, a nonlinear layer, a pooling layer, a fully-connected layer, a domain adaptation layer, and a classification layer connected in sequence;
step 2.2, connecting the input X^0 to the convolutional layer through convolution kernels, and performing feature extraction sequentially through the convolutional layer, the nonlinear layer, the pooling layer, and the fully-connected layer;
step 2.3, inputting the features extracted by the fully-connected layer into the domain adaptation layer;
step 2.4, connecting the output of the domain adaptation layer to the classification layer for classification.
The loss function of the convolutional domain adaptation network CDAN is defined as:
L(θ) = L_cls(θ) + α_1 L_M(θ) + α_2 L_C(θ)
where L_cls(θ) is the source domain classification loss, and α_1 and α_2 are the marginal domain adaptation parameter and the covariance domain adaptation parameter, respectively;
the pretrained convolutional domain adaptation network CDAN is used as an auxiliary classifier, and the target domain pseudo labels are obtained with this auxiliary classifier.
Further, step 3 specifically includes the following steps:
deep domain-invariant features X ∈ R^(n×d_2) are extracted through the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X;
taking the extracted features X as input, X is mapped to the feature nodes through the weights A_i, and the i-th group of mapped features MF is:
Z_i = X A_i + β_i, i = 1, ..., d_M   (1)
where Z_i ∈ R^(n×G_M); A_i and β_i respectively denote the connection weights and biases from the input X to the MF; d_M is the number of feature node groups; G_M is the dimension of each group of feature nodes; and Z = [Z_1, ..., Z_(d_M)] denotes the mapped features;
formula (1) is optimized as:
argmin_(A_i) ||X A_i − Z_i||_2^2 + λ||A_i||_1   (2)
where λ denotes the regularization coefficient;
the domain adaptation regularization term D_cf(P_s, P_t) based on the weighted conditional maximum mean discrepancy WCMMD is added to the MF mapping process to perform weighted adaptation of the conditional probability distributions of the source and target domains:
argmin_(A_i) ||X A_i − Z_i||_2^2 + λ||A_i||_1 + γ D_cf(P_s, P_t)   (3)
where γ denotes the coefficient of the domain adaptation regularization term and c ∈ {1, 2, ..., C} is the class index; the domain adaptation regularization term D_cf(P_s, P_t) is:
D_cf(P_s, P_t) = Σ_(c=1)^C ω_c || (1/n_s^c) Σ_(x_i∈X_s^c) x_i A − (1/n_t^c) Σ_(x_j∈X_t^c) x_j A ||_2^2
where the class importance weight ω_c is constructed from the source domain labels and the target domain pseudo labels, n_s^c and n_t^c denote the numbers of class-c samples in the source and target domains, and X_s^c and X_t^c denote the class-c sample sets of the two domains;
equation (3) can be rewritten in matrix form and solved by the alternating direction method of multipliers ADMM to obtain the weights A_i;
after the required weights A_i are obtained, Z_i can be obtained by:
Z_i = X A_i
and the source domain and target domain features in the MF can then be computed as Z_s = X_s A ∈ R^(n_s×d_M G_M) and Z_t = X_t A ∈ R^(n_t×d_M G_M);
the weighted conditional width learning network is obtained through the above steps.
Further, step 4 specifically includes the following steps:
the MF features are passed through the random weights W_E and mapped to the enhancement nodes EN to realize feature width expansion of the domain-invariant features:
H = σ(Z W_E)
where W_E ∈ R^(d_M G_M×d_E) are the random weights, σ(·) is the tansig function, H ∈ R^(n×d_E) is the EN feature, and d_E denotes the number of EN nodes;
the source and target domain features in the EN can be expressed as H_s = σ(Z_s W_E) and H_t = σ(Z_t W_E); Z_s and H_s are the source domain features of the MF and EN, and Z_t and H_t are the target domain features of the MF and EN.
Further, in step 5, the output weights are computed via ridge regression, specifically as follows:
MF and EN are simultaneously mapped to the output layer, and the objective function of the weighted conditional width learning network WCBN is:
argmin_W ||U_s W − Y_s||_2^2 + δ||W||_2^2
where U_s = [Z_s | H_s] and δ is the regularization coefficient; solving the above formula via ridge regression gives:
W = (U_s^T U_s + δI)^(−1) U_s^T Y_s
the prediction result is:
Y_t = U_t W
where U_t = [Z_t | H_t].
has the advantages that: the method can complete the unsupervised classification of the hyperspectral images of the target domain only by using the source domain labeled samples.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings.
As shown in fig. 1, the hyperspectral image classification method based on the convolution width migration learning network of the invention includes the following steps:
Step 1: dimensionality reduction is performed on the original hyperspectral image by band selection to remove band redundancy, obtaining reduced-dimension hyperspectral data X^0; the hyperspectral data X^0 comprise the reduced-dimension source domain data X_s^0 and the target domain data X_t^0.
Define the number of bands of the original hyperspectral image HSI as N_b and the number of bands after dimensionality reduction as d. HSI bands are selected at sampling intervals a and b, respectively, where ⌊·⌋ denotes the rounding-down (floor) operation used to compute the intervals.
Define the reduced X^0 ∈ R^(n×d) as the input of the model, where X^0 denotes the hyperspectral image data and n denotes the number of samples. X_s^0 ∈ R^(n_s×d) and X_t^0 ∈ R^(n_t×d) are the source domain and target domain data after band selection, respectively, where n_s denotes the number of source domain samples, n_t denotes the number of target domain samples, and Y_s is the source domain label.
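As a rough illustration of step 1, the sketch below performs interval-based band selection in NumPy. The patent's exact index formula with intervals a and b is not reproduced here; evenly spaced band indices are assumed instead, and the cube size is a toy example:

```python
import numpy as np

def band_select(cube, d):
    """Keep d of the N_b original bands by uniform-interval sampling.

    cube: hyperspectral image of shape (H, W, N_b).
    Assumption: evenly spaced indices stand in for the patent's a/b intervals.
    """
    n_b = cube.shape[-1]
    idx = np.linspace(0, n_b - 1, d).round().astype(int)  # evenly spaced band indices
    return cube[..., idx]

hsi = np.random.default_rng(1).random((8, 8, 200))  # toy HSI with N_b = 200 bands
x0 = band_select(hsi, 20)                           # reduced to d = 20 bands
print(x0.shape)  # (8, 8, 20)
```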
Step 2: a domain adaptation layer is constructed and added into the convolutional neural network, simultaneously aligning the marginal probability distributions and the second-order statistics of the source domain and the target domain to obtain the convolutional domain adaptation network CDAN. The target domain data are classified with the convolutional domain adaptation network CDAN to obtain the target domain pseudo labels.
Step 2.1: construct a domain adaptation layer and add it into the convolutional neural network. The domain adaptation layer comprises a marginal domain adaptation term L_M(θ) and a covariance domain adaptation term L_C(θ), where θ denotes the network parameters; the convolutional domain adaptation network CDAN is thus obtained.
The convolutional domain adaptation network CDAN comprises a convolutional layer, a nonlinear layer, a pooling layer, a fully-connected layer, a domain adaptation layer, and a classification layer connected in sequence.
Step 2.2: connect the input to the convolutional layer through convolution kernels to obtain the feature maps, computed as:
C = I_C * K_C + b_C
where I_C is the convolutional layer input, * denotes the convolution operation, K_C is the convolution kernel, and b_C is the bias. The convolutional layer output is connected as input to the nonlinear layer, whose output is:
F_N = f(I_N)
where I_N is the nonlinear layer input and f(·) is the activation function. The nonlinear layer output is connected as input to the pooling layer, whose output is:
F_P = down(I_P)
where I_P is the pooling layer input and down(·) denotes the down-sampling (pooling) operation. The pooling layer output is connected as input to the fully-connected layer, whose output is:
F_F = I_F W_F + b_F
where I_F is the fully-connected layer input, W_F the fully-connected weights, and b_F the bias.
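The four layer equations above can be traced with a small single-channel NumPy sketch. A ReLU is assumed for the nonlinear layer f(·) and 2×2 max pooling for down(·); the patent does not specify either, and all sizes are toy values:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, k):
    # 'valid' 2-D convolution (cross-correlation form), single channel
    h, w = img.shape
    kh, kw = k.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

def maxpool2(x):
    # non-overlapping 2x2 max pooling
    h, w = x.shape
    return x[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

img = rng.random((9, 9))                 # one toy band patch
C   = conv2d(img, rng.random((2, 2)))    # convolutional layer output: 8x8
F_N = np.maximum(C, 0)                   # nonlinear layer (ReLU assumed)
F_P = maxpool2(F_N)                      # pooling layer output: 4x4
W_F = rng.random((16, 5))
b_F = rng.random(5)
F_F = F_P.reshape(-1) @ W_F + b_F        # fully-connected layer: 5 features
print(F_F.shape)  # (5,)
```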
Step 2.3: input the features extracted by the fully-connected layer into the domain adaptation layer to reduce the distribution discrepancy between the source domain and the target domain.
Step 2.4: connect the output of the domain adaptation layer to the classification layer for classification.
The loss function of the convolutional domain adaptation network CDAN is defined as:
L(θ) = L_cls(θ) + α_1 L_M(θ) + α_2 L_C(θ)
where L_cls(θ) is the source domain classification loss, and α_1 and α_2 are the marginal domain adaptation parameter and the covariance domain adaptation parameter, respectively.
The pretrained convolutional domain adaptation network CDAN is used as an auxiliary classifier, and the target domain pseudo labels are obtained with this auxiliary classifier.
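A hedged sketch of the two adaptation terms: the marginal term is approximated here by the squared distance between domain feature means (a linear-kernel MMD) and the covariance term by a CORAL-style Frobenius distance between domain covariances. The patent does not give the explicit formulas, so both forms, the feature sizes, and the placeholder classification loss are assumptions:

```python
import numpy as np

def linear_mmd(fs, ft):
    # squared distance between the domain feature means (marginal alignment)
    return float(np.sum((fs.mean(axis=0) - ft.mean(axis=0)) ** 2))

def coral(fs, ft):
    # Frobenius distance between the two domain covariance matrices
    cs = np.cov(fs, rowvar=False)
    ct = np.cov(ft, rowvar=False)
    return float(np.sum((cs - ct) ** 2)) / (4 * fs.shape[1] ** 2)

rng = np.random.default_rng(0)
feat_s = rng.standard_normal((100, 8))        # source features at the adaptation layer
feat_t = rng.standard_normal((100, 8)) + 0.5  # shifted target features
cls_loss = 0.7                                # placeholder source classification loss
alpha1, alpha2 = 1.0, 1.0                     # marginal / covariance trade-off weights
total = cls_loss + alpha1 * linear_mmd(feat_s, feat_t) + alpha2 * coral(feat_s, feat_t)
print(total > cls_loss)  # True: the domain shift adds an adaptation penalty
```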
Step 3: combine the source domain labels, the target domain pseudo labels, and the maximum mean discrepancy to construct the weighted conditional maximum mean discrepancy WCMMD, and add a domain adaptation regularization term based on WCMMD to the MF mapping process to obtain the weighted conditional width learning network WCBN. This specifically comprises the following steps:
Deep domain-invariant features X ∈ R^(n×d_2) are extracted through the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X.
Taking the extracted features X as input, X is mapped to the feature nodes through the weights A_i, and the i-th group of mapped features MF is:
Z_i = X A_i + β_i, i = 1, ..., d_M   (1)
where Z_i ∈ R^(n×G_M); A_i and β_i respectively denote the connection weights and biases from the input X to the MF; d_M is the number of feature node groups; G_M is the dimension of each group of feature nodes; and Z = [Z_1, ..., Z_(d_M)] denotes the mapped features.
Formula (1) is optimized as:
argmin_(A_i) ||X A_i − Z_i||_2^2 + λ||A_i||_1   (2)
where λ denotes the regularization coefficient.
The domain adaptation regularization term D_cf(P_s, P_t) based on the weighted conditional maximum mean discrepancy WCMMD is added to the MF mapping process to perform weighted adaptation of the conditional probability distributions of the source and target domains:
argmin_(A_i) ||X A_i − Z_i||_2^2 + λ||A_i||_1 + γ D_cf(P_s, P_t)   (3)
where γ denotes the coefficient of the domain adaptation regularization term and c ∈ {1, 2, ..., C} is the class index. The domain adaptation regularization term D_cf(P_s, P_t) is:
D_cf(P_s, P_t) = Σ_(c=1)^C ω_c || (1/n_s^c) Σ_(x_i∈X_s^c) x_i A − (1/n_t^c) Σ_(x_j∈X_t^c) x_j A ||_2^2
where the class importance weight ω_c is constructed from the source domain labels and the target domain pseudo labels, n_s^c and n_t^c denote the numbers of class-c samples in the source and target domains, and X_s^c and X_t^c denote the class-c sample sets of the two domains.
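The weighted conditional discrepancy can be sketched as a per-class mean discrepancy weighted by ω_c. The weight values below and the use of simple class means are illustrative assumptions; the patent's exact construction of ω_c is not reproduced:

```python
import numpy as np

def wcmmd(zs, ys, zt, yt_pseudo, weights):
    """Weighted conditional MMD over mapped features (sketch).

    For each class c, compare the source and target class means and scale
    the squared discrepancy by the class-importance weight omega_c.
    """
    total = 0.0
    for c, w in weights.items():
        ms = zs[ys == c].mean(axis=0)          # source class-c mean
        mt = zt[yt_pseudo == c].mean(axis=0)   # target class-c mean (pseudo labels)
        total += w * float(np.sum((ms - mt) ** 2))
    return total

rng = np.random.default_rng(0)
zs = rng.standard_normal((60, 4)); ys = rng.integers(0, 2, 60)   # source MF + labels
zt = rng.standard_normal((60, 4)); yt = rng.integers(0, 2, 60)   # target MF + pseudo labels
w = {0: 0.5, 1: 1.5}   # toy class-importance weights omega_c (assumed values)
print(wcmmd(zs, ys, zt, yt, w) >= 0.0)  # True
```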
Equation (3) can be rewritten in matrix form and solved by the alternating direction method of multipliers ADMM to obtain the weights A_i.
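A minimal ADMM sketch for a lasso-type subproblem of the form min_A ||XA − Z||² + λ||A||₁. The ℓ1 form is an assumption suggested by the use of ADMM, and the WCMMD regularizer of the full objective is omitted here:

```python
import numpy as np

def admm_lasso(X, Z, lam=0.05, rho=10.0, iters=500):
    """ADMM for min_A 0.5*||X@A - Z||_F^2 + lam*||A||_1 (sketch only)."""
    d, g = X.shape[1], Z.shape[1]
    A = np.zeros((d, g)); B = np.zeros((d, g)); U = np.zeros((d, g))
    P = np.linalg.inv(X.T @ X + rho * np.eye(d))     # cached system matrix
    XtZ = X.T @ Z
    for _ in range(iters):
        A = P @ (XtZ + rho * (B - U))                # least-squares step
        B = np.sign(A + U) * np.maximum(np.abs(A + U) - lam / rho, 0.0)  # soft threshold
        U = U + A - B                                # dual update
    return B

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
A_true = rng.standard_normal((6, 3)); A_true[3:] = 0.0   # sparse ground-truth weights
Z = X @ A_true
A_hat = admm_lasso(X, Z)
print(A_hat.shape)  # (6, 3)
```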
After the required weights A_i are obtained, Z_i can be obtained by:
Z_i = X A_i
and the source domain and target domain features in the MF can then be computed as Z_s = X_s A ∈ R^(n_s×d_M G_M) and Z_t = X_t A ∈ R^(n_t×d_M G_M).
The weighted conditional width learning network is obtained through the above steps.
Step 4: input the features extracted by CDAN into WCBN, use WCBN to perform weighted conditional probability distribution alignment on the deep domain-invariant features extracted by CDAN to reduce the discrepancy between the conditional probability distributions of the two domains and the class weight bias, and perform feature width expansion on the domain-invariant features to further enhance the feature representation capability.
The MF features are passed through the random weights W_E and mapped to the enhancement nodes EN to realize feature width expansion of the domain-invariant features:
H = σ(Z W_E)
where W_E ∈ R^(d_M G_M×d_E) are the random weights, σ(·) is the tansig function, H ∈ R^(n×d_E) is the EN feature, and d_E denotes the number of EN nodes.
The source and target domain features in the EN can be expressed as H_s = σ(Z_s W_E) and H_t = σ(Z_t W_E); Z_s and H_s are the source domain features of the MF and EN, and Z_t and H_t are the target domain features of the MF and EN.
Step 5: the output weights are rapidly computed via ridge regression.
MF and EN are simultaneously mapped to the output layer, and the objective function of the weighted conditional width learning network WCBN is:
argmin_W ||U_s W − Y_s||_2^2 + δ||W||_2^2
where U_s = [Z_s | H_s] and δ is the regularization coefficient. Solving the above formula via ridge regression gives:
W = (U_s^T U_s + δI)^(−1) U_s^T Y_s
The prediction result is:
Y_t = U_t W
where U_t = [Z_t | H_t].
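The closed-form ridge solution and the prediction step can be sketched directly in NumPy. The sizes and one-hot labels below are toy assumptions:

```python
import numpy as np

def ridge_output_weights(U_s, Y_s, delta=0.1):
    # closed-form ridge regression: W = (U_s^T U_s + delta*I)^(-1) U_s^T Y_s
    k = U_s.shape[1]
    return np.linalg.solve(U_s.T @ U_s + delta * np.eye(k), U_s.T @ Y_s)

rng = np.random.default_rng(0)
Zs, Hs = rng.standard_normal((80, 6)), rng.standard_normal((80, 9))  # source MF / EN
Zt, Ht = rng.standard_normal((20, 6)), rng.standard_normal((20, 9))  # target MF / EN
Ys = np.eye(3)[rng.integers(0, 3, 80)]   # one-hot source labels, 3 classes

U_s = np.hstack([Zs, Hs])                # MF and EN fed to the output layer together
U_t = np.hstack([Zt, Ht])
W = ridge_output_weights(U_s, Ys, delta=0.1)
Y_t = U_t @ W                            # target predictions; argmax gives the class
pred = Y_t.argmax(axis=1)
print(pred.shape)  # (20,)
```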
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.

Claims (6)

1. A hyperspectral image classification method based on a convolution width migration network is characterized by comprising the following steps:
step 1, performing dimensionality reduction on the original hyperspectral image by band selection to remove band redundancy, and obtaining reduced-dimension hyperspectral data X^0, the hyperspectral data X^0 comprising the reduced-dimension source domain data X_s^0 and the target domain data X_t^0;
and step 2, constructing a domain adaptation layer, adding the domain adaptation layer into a convolutional neural network, simultaneously aligning the marginal probability distributions and the second-order statistics of the source domain and the target domain to obtain a convolutional domain adaptation network CDAN, and classifying the target domain data with the convolutional domain adaptation network CDAN to obtain target domain pseudo labels.
2. The hyperspectral image classification method based on the convolution width migration network according to claim 1, characterized by further comprising:
step 3, combining the source domain labels, the target domain pseudo labels, and the maximum mean discrepancy to construct the weighted conditional maximum mean discrepancy WCMMD, and adding a domain adaptation regularization term based on WCMMD to the MF mapping process to obtain the weighted conditional width learning network WCBN;
step 4, inputting the features extracted by CDAN to WCBN, performing weighted conditional probability distribution alignment on the deep domain-invariant features extracted by CDAN using WCBN, and performing feature width expansion on the domain-invariant features;
and step 5, computing the output weights via ridge regression.
3. The hyperspectral image classification method based on the convolution width migration network according to claim 1 or 2, characterized in that step 2 specifically comprises the following steps:
step 2.1, constructing a domain adaptation layer and adding it into the convolutional neural network, wherein the domain adaptation layer comprises a marginal domain adaptation term L_M(θ) and a covariance domain adaptation term L_C(θ), where θ denotes the network parameters, thereby obtaining the convolutional domain adaptation network CDAN;
the convolutional domain adaptation network CDAN comprises a convolutional layer, a nonlinear layer, a pooling layer, a fully-connected layer, a domain adaptation layer, and a classification layer connected in sequence;
step 2.2, connecting the input X^0 to the convolutional layer through convolution kernels, and performing feature extraction sequentially through the convolutional layer, the nonlinear layer, the pooling layer, and the fully-connected layer;
step 2.3, inputting the features extracted by the fully-connected layer into the domain adaptation layer;
step 2.4, connecting the output of the domain adaptation layer to the classification layer for classification;
the loss function of the convolutional domain adaptation network CDAN is defined as:
L(θ) = L_cls(θ) + α_1 L_M(θ) + α_2 L_C(θ)
where L_cls(θ) is the source domain classification loss, and α_1 and α_2 are the marginal domain adaptation parameter and the covariance domain adaptation parameter, respectively;
the pretrained convolutional domain adaptation network CDAN is used as an auxiliary classifier, and the target domain pseudo labels are obtained with this auxiliary classifier.
4. The hyperspectral image classification method based on the convolution width migration network according to claim 2, characterized in that step 3 specifically comprises the following steps:
deep domain-invariant features X ∈ R^(n×d_2) are extracted through the convolutional domain adaptation network CDAN, where d_2 denotes the dimension of X;
taking the extracted features X as input, X is mapped to the feature nodes through the weights A_i, and the i-th group of mapped features MF is:
Z_i = X A_i + β_i, i = 1, ..., d_M   (1)
where Z_i ∈ R^(n×G_M); A_i and β_i respectively denote the connection weights and biases from the input X to the MF; d_M is the number of feature node groups; G_M is the dimension of each group of feature nodes; and Z = [Z_1, ..., Z_(d_M)] denotes the mapped features;
formula (1) is optimized as:
argmin_(A_i) ||X A_i − Z_i||_2^2 + λ||A_i||_1   (2)
where λ denotes the regularization coefficient;
the domain adaptation regularization term D_cf(P_s, P_t) based on the weighted conditional maximum mean discrepancy WCMMD is added to the MF mapping process to perform weighted adaptation of the conditional probability distributions of the source and target domains:
argmin_(A_i) ||X A_i − Z_i||_2^2 + λ||A_i||_1 + γ D_cf(P_s, P_t)   (3)
where γ denotes the coefficient of the domain adaptation regularization term and c ∈ {1, 2, ..., C} is the class index; the domain adaptation regularization term D_cf(P_s, P_t) is:
D_cf(P_s, P_t) = Σ_(c=1)^C ω_c || (1/n_s^c) Σ_(x_i∈X_s^c) x_i A − (1/n_t^c) Σ_(x_j∈X_t^c) x_j A ||_2^2
where the class importance weight ω_c is constructed from the source domain labels and the target domain pseudo labels, n_s^c and n_t^c denote the numbers of class-c samples in the source and target domains, and X_s^c and X_t^c denote the class-c sample sets of the two domains;
equation (3) can be rewritten in matrix form and solved by the alternating direction method of multipliers ADMM to obtain the weights A_i;
after the required weights A_i are obtained, Z_i can be obtained by:
Z_i = X A_i
and the source domain and target domain features in the MF can then be computed as Z_s = X_s A ∈ R^(n_s×d_M G_M) and Z_t = X_t A ∈ R^(n_t×d_M G_M);
the weighted conditional width learning network is obtained through the above steps.
5. The hyperspectral image classification method based on the convolution width migration network according to claim 2, characterized in that step 4 specifically comprises the following steps:
the MF features are passed through the random weights W_E and mapped to the enhancement nodes EN to realize feature width expansion of the domain-invariant features:
H = σ(Z W_E)
where W_E ∈ R^(d_M G_M×d_E) are the random weights, σ(·) is the tansig function, H ∈ R^(n×d_E) is the EN feature, and d_E denotes the number of EN nodes;
the source and target domain features in the EN can be expressed as H_s = σ(Z_s W_E) and H_t = σ(Z_t W_E); Z_s and H_s are the source domain features of the MF and EN, and Z_t and H_t are the target domain features of the MF and EN.
6. The hyperspectral image classification method based on the convolution width migration network according to claim 2, characterized in that in step 5 the output weights are computed via ridge regression, specifically as follows:
MF and EN are simultaneously mapped to the output layer, and the objective function of the weighted conditional width learning network WCBN is:
argmin_W ||U_s W − Y_s||_2^2 + δ||W||_2^2
where U_s = [Z_s | H_s] and δ is the regularization coefficient; solving the above formula via ridge regression gives:
W = (U_s^T U_s + δI)^(−1) U_s^T Y_s
the prediction result is:
Y_t = U_t W
where U_t = [Z_t | H_t].
CN202111244668.7A 2021-10-26 2021-10-26 Hyperspectral image classification method based on convolution width migration network Active CN113947725B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111244668.7A CN113947725B (en) 2021-10-26 2021-10-26 Hyperspectral image classification method based on convolution width migration network


Publications (2)

Publication Number Publication Date
CN113947725A true CN113947725A (en) 2022-01-18
CN113947725B CN113947725B (en) 2024-06-14

Family

ID=79332583


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114723994A (en) * 2022-04-18 2022-07-08 中国矿业大学 Hyperspectral image classification method based on dual-classifier confrontation enhancement network
CN116150668A (en) * 2022-12-01 2023-05-23 中国矿业大学 Rotating equipment fault diagnosis method based on double-stage alignment partial migration network
CN118247584A (en) * 2024-05-28 2024-06-25 安徽大学 Hyperspectral image domain generalization method, hyperspectral image domain generalization device, hyperspectral image domain generalization system and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109344891A (en) * 2018-09-21 2019-02-15 北京航空航天大学 A kind of high-spectrum remote sensing data classification method based on deep neural network
CN109359623A (en) * 2018-11-13 2019-02-19 西北工业大学 High spectrum image based on depth Joint Distribution adaptation network migrates classification method
US20200130177A1 (en) * 2018-10-29 2020-04-30 Hrl Laboratories, Llc Systems and methods for few-shot transfer learning
CN111709448A (en) * 2020-05-20 2020-09-25 西安交通大学 Mechanical fault diagnosis method based on migration relation network
AU2020103905A4 (en) * 2020-12-04 2021-02-11 Chongqing Normal University Unsupervised cross-domain self-adaptive medical image segmentation method based on deep adversarial learning
CN112906756A (en) * 2021-01-28 2021-06-04 昆山则立诚智能设备有限公司 High-image classification method and system for cross-channel quantity transfer learning


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fan Shuaichang; Yi Xiaomei; Li Jian; Hui Guohua; Gao Yuanyuan: "Poisonous mushroom image recognition based on deep residual network and transfer learning", Chinese Journal of Sensors and Actuators, no. 01, 15 January 2020 (2020-01-15) *




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant