CN104008394B - Semi-supervision hyperspectral data dimension descending method based on largest neighbor boundary principle - Google Patents


Info

Publication number
CN104008394B
CN104008394B (application CN201410213709.XA)
Authority
CN
China
Prior art keywords
matrix
sample
regular
marker samples
represent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410213709.XA
Other languages
Chinese (zh)
Other versions
CN104008394A (en)
Inventor
杨淑媛
焦李成
冯志玺
刘芳
缑水平
侯彪
王爽
杨丽霞
邓晓政
任宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201410213709.XA priority Critical patent/CN104008394B/en
Publication of CN104008394A publication Critical patent/CN104008394A/en
Application granted granted Critical
Publication of CN104008394B publication Critical patent/CN104008394B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a semi-supervised hyperspectral data dimensionality reduction method based on the largest neighbor boundary (maximum neighbor margin) principle. The method mainly solves the problems in the prior art that a large amount of supervision information is needed and that the data are poorly discriminable after dimensionality reduction. The method includes the steps of (1) dividing a remote sensing database sample set into a training data set and a labeled sample set, (2) generating the scatter matrices of the labeled sample set, (3) generating the spatial neighbor matrix of the training data set, (4) generating the similarity matrix of the training data set, (5) constructing a semi-supervised discriminant term from the scatter matrices by the maximum margin criterion, (6) constructing a semi-supervised regularization term, and (7) obtaining the optimal projection matrix by minimizing the sum of the discriminant term and the regularization term, thereby achieving dimensionality reduction. The regularization term is built from the manifold regularity expressed by the low-rank representation and the spatial regularity of spatial consistency; by means of this joint spatial-spectral regularization strategy, the projection matrix becomes robust and complete, the discriminability of the data after dimensionality reduction is improved, and the method can be used for classification and recognition of hyperspectral data.

Description

Semi-supervised hyperspectral data dimensionality reduction method based on the maximum neighbor margin
Technical field
The invention belongs to the technical field of image processing and further relates to a data dimensionality reduction method that can be used for dimensionality reduction and classification of remote sensing image data.
Background technology
Hyperspectral remote sensing has been successfully applied in fields such as national defense and security, environmental monitoring and resource exploration, and has become one of the modern high technologies. However, the development of data processing techniques for hyperspectral remote sensing images lags behind the development of hardware such as imaging equipment, which restricts the further popularization and application of hyperspectral remote sensing. Classification is an important way of analyzing the rich ground-object information in hyperspectral remote sensing images and of interpreting satellite images; therefore, research on ground-object classification of hyperspectral remote sensing data has very important practical value.
A hyperspectral remote sensing image contains abundant spatial, radiometric and spectral information about ground objects and has the advantages of high spectral resolution and high spatial resolution. Moreover, the spectra that characterize the properties of ground objects are organically combined with the imagery that captures their spatial and geometric properties, which facilitates classification and target recognition of ground objects.
Hyperspectral remote sensing data provide abundant ground-object information, but this wealth of information comes with massive high-dimensional data that pose a great challenge to traditional classification algorithms. On the one hand, supervised classification algorithms for hyperspectral remote sensing images need a fairly large number of labeled samples, otherwise the classification accuracy is very low; moreover, the massive high-dimensional data bring huge time and computational complexity to training and learning. Improving classification accuracy while reducing the amount of computation has therefore become a research hotspot in the hyperspectral remote sensing field. On the other hand, hyperspectral data are highly redundant between adjacent bands, so preprocessing hyperspectral remote sensing data to reduce this redundancy before they are used not only lowers the dimensionality of the data and reduces the amount of computation in subsequent classification, but also yields more robust and accurate classification results.
Existing classical dimensionality reduction methods mainly fall into the following three classes:
(1) Unsupervised dimensionality reduction methods, such as principal component analysis (PCA), which projects the data onto the directions of maximum variance according to the maximum-variance principle. Because no supervision information is used, the data after dimensionality reduction do not have good discriminability.
(2) Supervised dimensionality reduction methods, such as linear discriminant analysis (LDA), which obtains the projection matrix by maximizing the ratio of the between-class scatter matrix to the within-class scatter matrix. LDA has better discriminability than PCA, but the highest dimension after LDA reduction is c-1, where c is the number of sample classes, and LDA is not suitable for reducing the dimensionality of non-Gaussian data, so its universality suffers.
(3) Semi-supervised dimensionality reduction methods, which learn the structure of the data from a large number of unlabeled samples, the emphasis of unsupervised learning. Existing semi-supervised methods focus mainly on manifold regularization of the data; they do not take the global structure of the data into account and they ignore the spatial information of the image data, so the spatial structure information of the image data cannot be used effectively.
Content of the invention
In view of the deficiencies of the above prior art, the present invention proposes a semi-supervised dimensionality reduction method based on the maximum neighbor margin, which uses a small amount of supervision information to achieve efficient dimensionality reduction of hyperspectral remote sensing data.
The technical scheme for achieving the object of the invention is as follows: the structure of the data is obtained through a low-rank representation regularizer, the spatial structure information of the image is obtained through a spatial-consistency constraint, and the optimal projection matrix is then obtained by eigenvalue decomposition, realizing dimensionality reduction of the data. The concrete steps are as follows:
(1) Divide the remote sensing image database sample set into a training data set X and a labeled sample set Y;
(2) Generate the scatter matrices of the labeled sample set:
2a) Generate the within-class scatter matrix of the labeled sample set by the within-class scatter formula:
C = \sum_{i,j: y_j \in N_i^o} \frac{(y_i - y_j)(y_i - y_j)^T}{|N_i^o|},
where C denotes the within-class scatter matrix, y_i and y_j denote the i-th and j-th labeled samples respectively, N_i^o denotes the set of same-class (homogeneous) neighbors of the i-th labeled sample, |·| denotes the number of elements in a set, and T denotes matrix transposition;
2b) Generate the between-class scatter matrix of the labeled sample set by the between-class scatter formula:
S = \sum_{i,k: y_k \in N_i^e} \frac{(y_i - y_k)(y_i - y_k)^T}{|N_i^e|},
where S denotes the between-class scatter matrix, y_i and y_k denote the i-th and k-th labeled samples respectively, and N_i^e denotes the set of different-class (heterogeneous) neighbors of the i-th labeled sample;
(3) Generate the spatial neighbor matrix of the training data set from the spatial neighborhood relation:
L(i,j) = 1 if x_j is a spatial neighbor of x_i, and L(i,j) = 0 otherwise,
where L denotes the spatial neighbor matrix, L(i,j) denotes the element in the i-th row and j-th column of L, and x_i and x_j denote the i-th and j-th training samples respectively;
(4) Solve the low-rank representation of the training data set by the inexact augmented Lagrange multiplier method and generate the similarity matrix Z of the training data set;
(5) From the within-class scatter matrix C and the between-class scatter matrix S, construct the semi-supervised discriminant term by the maximum margin criterion: J(W) = tr(W^T(S - C)W), and define R = S - C as the discriminant matrix, where W denotes the optimal projection matrix, tr(·) denotes the trace of a matrix, and T denotes matrix transposition;
(6) Construct the semi-supervised regularization terms:
6a) From the spatial neighbor matrix L, construct the spatial-consistency regularization term by the local-consistency constraint, and define R_1 = X(L - \Delta_L)X^T as the spatial-consistency regularization matrix, where L(i,j) denotes the element in the i-th row and j-th column of the neighbor matrix L and \Delta_L denotes the diagonal matrix whose diagonal elements are the row sums of L, i.e. \Delta_L(i,i) = \sum_j L(i,j);
6b) From the similarity matrix Z, construct the manifold regularization term by the sparsity-preserving criterion, and define R_2 = \frac{1}{M} X(Z^T + Z - Z^T Z - I)X^T as the manifold regularization matrix, where z_i denotes the i-th column of the similarity matrix Z, I denotes the identity matrix, and M denotes the number of training samples;
(7) Solve the optimal projection matrix:
From the discriminant matrix R, the spatial-consistency regularization matrix R_1 and the manifold regularization matrix R_2, obtain the objective matrix U = R - λ_1 R_1 - λ_2 R_2.
Perform eigenvalue decomposition on the objective matrix U, sort the eigenvalues in descending order, and take the eigenvectors corresponding to the first m eigenvalues to form the optimal projection matrix W, where λ_1 is the spatial-consistency regularization parameter, λ_2 is the manifold regularization parameter, and m denotes the dimension of the training and test samples after dimensionality reduction.
Compared with the prior art, the present invention has the following advantage:
The present invention constructs the semi-supervised regularization terms from the manifold regularizer of the low-rank representation and the spatial regularizer of spatial consistency, generating a structural regularization of both the manifold and the space of the training data set. With only a small amount of supervision information, this joint spatial-spectral regularization strategy makes the projection matrix more robust and complete and improves the discriminability of the data after dimensionality reduction.
Brief description
Fig. 1 is the flow chart of the present invention;
Fig. 2 is the spatial neighborhood relation diagram;
Fig. 3 shows the IndianPines hyperspectral data used in the simulations of the present invention and its ground-truth label map;
Fig. 4 shows the classification result maps of the present invention when the number of labeled samples is 5 and 8.
Specific embodiment
With reference to Fig. 1, the present invention is described in further detail.
Step 1: Divide the remote sensing image database sample set into a training data set X and a labeled sample set Y.
1a) In the remote sensing image data sample set, randomly select 40% of the data as the training data X ∈ R^{D×M} and use the remaining 60% of the data as the test sample data set T ∈ R^{D×T}, where D denotes the dimension of the training and test samples, R^n denotes the n-dimensional real space, M is the total number of training samples and T is the total number of test samples. In the IndianPines data set of the embodiment of the present invention, the sample dimension D is 200 and the total number of training samples M is 4147;
1b) In the training data set X, randomly select k samples from each class to form the labeled sample set Y ∈ R^{D×Q} carrying supervision information, where Q = c × k and c is the number of classes. In the IndianPines data set of the embodiment of the present invention, c is 16 and k takes values in {5, 6, 8};
1c) In the labeled sample set Y, compute for each labeled sample y_i its homogeneous (same-class) neighbor set N_i^o and its heterogeneous (different-class) neighbor set N_i^e by Euclidean distance.
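The following is a minimal NumPy sketch of step 1c. It assumes the labeled samples are the columns of Y (shape D×Q) with integer class labels in labels (length Q), and that each neighbor set keeps the k_nn nearest samples by Euclidean distance; the function name and the parameter k_nn are illustrative, not taken from the patent.

```python
import numpy as np

def neighbor_sets(Y, labels, k_nn=5):
    """For each labeled sample y_i (columns of Y, shape D x Q), return the index
    sets of its k_nn nearest same-class neighbors (homogeneous, N^o) and k_nn
    nearest different-class neighbors (heterogeneous, N^e), by Euclidean distance."""
    Q = Y.shape[1]
    # pairwise squared Euclidean distances between labeled samples
    sq = np.sum(Y ** 2, axis=0)
    dist = sq[:, None] + sq[None, :] - 2.0 * (Y.T @ Y)
    N_o, N_e = [], []
    for i in range(Q):
        same = np.where((labels == labels[i]) & (np.arange(Q) != i))[0]
        diff = np.where(labels != labels[i])[0]
        N_o.append(same[np.argsort(dist[i, same])][:k_nn])
        N_e.append(diff[np.argsort(dist[i, diff])][:k_nn])
    return N_o, N_e
```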
Step 2: Generate the scatter matrices of the labeled sample set.
2a) Generate the within-class scatter matrix of the labeled sample set by the within-class scatter formula:
C = \sum_{i,j: y_j \in N_i^o} \frac{(y_i - y_j)(y_i - y_j)^T}{|N_i^o|},
where C denotes the within-class scatter matrix, y_i and y_j denote the i-th and j-th labeled samples respectively, |·| denotes the number of elements in a set, and T denotes matrix transposition;
2b) Generate the between-class scatter matrix of the labeled sample set by the between-class scatter formula:
S = \sum_{i,k: y_k \in N_i^e} \frac{(y_i - y_k)(y_i - y_k)^T}{|N_i^e|},
where S denotes the between-class scatter matrix and y_i, y_k denote the i-th and k-th labeled samples respectively.
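A sketch of step 2 under the same assumptions (the labeled samples are the columns of Y, and N_o, N_e are the neighbor index sets from step 1c); it accumulates the within-class scatter C and the between-class scatter S exactly as in the two formulas above.

```python
import numpy as np

def scatter_matrices(Y, N_o, N_e):
    """Within-class scatter C and between-class scatter S of the labeled sample
    set, each outer product weighted by 1/|neighbor set|."""
    D, Q = Y.shape
    C = np.zeros((D, D))
    S = np.zeros((D, D))
    for i in range(Q):
        for j in N_o[i]:                      # homogeneous neighbors of y_i
            d = Y[:, i] - Y[:, j]
            C += np.outer(d, d) / len(N_o[i])
        for k in N_e[i]:                      # heterogeneous neighbors of y_i
            d = Y[:, i] - Y[:, k]
            S += np.outer(d, d) / len(N_e[i])
    return C, S
```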
Step 3: Generate the spatial neighbor matrix of the training data set from the spatial neighborhood relation.
Using the spatial neighborhood relation given in Fig. 2, determine the spatial neighbor samples of each sample x_i: if sample x_j is a spatial neighbor of sample x_i, then L(i,j) = 1, otherwise L(i,j) = 0. The spatial neighbor matrix of the training data set is generated in this way, where L denotes the spatial neighbor matrix, L(i,j) denotes the element in the i-th row and j-th column of L, and x_i and x_j denote the i-th and j-th training samples respectively.
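A sketch of step 3, assuming each training sample is an image pixel with known (row, col) coordinates and that the spatial neighborhood of Fig. 2 is the 8-connected pixel neighborhood; the exact neighborhood shape is an assumption here.

```python
import numpy as np

def spatial_neighbor_matrix(coords):
    """Binary spatial neighbor matrix L: L[i, j] = 1 if training pixels i and j
    are 8-connected neighbors in the image plane, 0 otherwise.
    coords: (M, 2) integer array of (row, col) pixel coordinates."""
    M = coords.shape[0]
    L = np.zeros((M, M))
    index = {tuple(rc): i for i, rc in enumerate(map(tuple, coords))}
    offsets = [(-1, -1), (-1, 0), (-1, 1),
               (0, -1),           (0, 1),
               (1, -1),  (1, 0),  (1, 1)]
    for i, (r, c) in enumerate(map(tuple, coords)):
        for dr, dc in offsets:
            j = index.get((r + dr, c + dc))
            if j is not None:
                L[i, j] = 1.0
    return L
```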
Step 4: Generate the similarity matrix Z of the training data set.
Iteratively solve the low-rank representation coefficients of the training data set by the inexact augmented Lagrange multiplier method, and take the absolute values of the low-rank representation coefficients as the similarity matrix Z of the training data set.
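A heavily simplified sketch of step 4: the standard inexact augmented Lagrange multiplier (ALM) iteration for the low-rank representation problem min ||Z||_* + λ||E||_{2,1} s.t. X = XZ + E, with the absolute low-rank coefficients returned as the similarity matrix. The parameter values (lam, mu, rho), stopping tolerance and iteration cap are illustrative assumptions, not the settings used in the patent.

```python
import numpy as np

def lrr_inexact_alm(X, lam=0.1, mu=1e-2, rho=1.1, mu_max=1e6,
                    tol=1e-6, max_iter=500):
    """Inexact-ALM solver for  min ||Z||_* + lam*||E||_{2,1}  s.t.  X = X Z + E.
    Returns |Z|, the absolute low-rank coefficients, as the similarity matrix."""
    D, M = X.shape
    Z = np.zeros((M, M)); J = np.zeros((M, M)); E = np.zeros((D, M))
    Y1 = np.zeros((D, M)); Y2 = np.zeros((M, M))
    XtX = X.T @ X
    inv = np.linalg.inv(XtX + np.eye(M))        # used by the closed-form Z update
    for _ in range(max_iter):
        # J update: singular value thresholding of Z + Y2/mu with threshold 1/mu
        U, s, Vt = np.linalg.svd(Z + Y2 / mu, full_matrices=False)
        J = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # Z update: closed-form least-squares step
        Z = inv @ (XtX - X.T @ E + J + (X.T @ Y1 - Y2) / mu)
        # E update: column-wise shrinkage (proximal operator of the l2,1 norm)
        G = X - X @ Z + Y1 / mu
        norms = np.linalg.norm(G, axis=0)
        E = G * (np.maximum(norms - lam / mu, 0.0) / np.maximum(norms, 1e-12))
        # multiplier and penalty updates
        res1 = X - X @ Z - E
        res2 = Z - J
        Y1 = Y1 + mu * res1
        Y2 = Y2 + mu * res2
        mu = min(rho * mu, mu_max)
        if max(np.abs(res1).max(), np.abs(res2).max()) < tol:
            break
    return np.abs(Z)
```

Each iteration performs a full SVD of an M×M matrix, so this sketch scales cubically with the number of training samples; for the 4147-sample IndianPines training set a more economical solver or a partial SVD would be needed in practice.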
Step 5: From the within-class scatter matrix C and the between-class scatter matrix S, construct the semi-supervised discriminant term by the maximum margin criterion: J(W) = tr(W^T(S - C)W), and define R = S - C as the discriminant matrix, where W denotes the optimal projection matrix and tr(·) denotes the trace of a matrix.
Step 6: Construct the semi-supervised regularization terms.
6a) From the spatial neighbor matrix L, construct the spatial-consistency regularization term by the local-consistency constraint, and define R_1 = X(L - \Delta_L)X^T as the spatial-consistency regularization matrix, where L(i,j) denotes the element in the i-th row and j-th column of L and \Delta_L denotes the diagonal matrix whose diagonal elements are the row sums of L, i.e. \Delta_L(i,i) = \sum_j L(i,j);
6b) From the similarity matrix Z, construct the manifold regularization term by the sparsity-preserving criterion, and define R_2 = \frac{1}{M} X(Z^T + Z - Z^T Z - I)X^T as the manifold regularization matrix, where z_i denotes the i-th column of the similarity matrix Z and I denotes the identity matrix.
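A sketch assembling the matrices of steps 5 and 6 from the quantities computed above: the discriminant matrix R = S - C, the spatial-consistency regularization matrix R_1 = X(L - Δ_L)X^T and the manifold regularization matrix R_2 = (1/M) X(Z^T + Z - Z^T Z - I)X^T; the function name is illustrative.

```python
import numpy as np

def build_matrices(X, C, S, L, Z):
    """Discriminant matrix R and the two regularization matrices R1, R2."""
    M = X.shape[1]
    R = S - C                                   # maximum-margin discriminant matrix
    delta_L = np.diag(L.sum(axis=1))            # diagonal matrix of row sums of L
    R1 = X @ (L - delta_L) @ X.T                # spatial-consistency regularizer
    R2 = X @ ((Z.T + Z - Z.T @ Z - np.eye(M)) / M) @ X.T   # manifold regularizer
    return R, R1, R2
```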
Step 7: Solve the optimal projection matrix.
From the discriminant matrix R, the spatial-consistency regularization matrix R_1 and the manifold regularization matrix R_2, obtain the objective matrix U = R - λ_1 R_1 - λ_2 R_2.
Perform eigenvalue decomposition on the objective matrix U, sort the eigenvalues in descending order, and take the eigenvectors corresponding to the first m eigenvalues to form the optimal projection matrix W, where λ_1 is the spatial-consistency regularization parameter, which balances the weight between the discriminant term and the spatial-consistency regularization term, λ_2 is the manifold regularization parameter, which balances the weight between the discriminant term and the manifold regularization term, and m denotes the dimension of the training and test samples after dimensionality reduction. In the IndianPines data set of the embodiment of the present invention, m is taken as the number of eigenvalues of the objective matrix U that are greater than 0.
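A sketch of step 7: form the objective matrix, eigendecompose it, keep the eigenvectors of the m largest eigenvalues (by default m = number of positive eigenvalues, as in the embodiment), and project the spectral data. The regularization parameter values are illustrative placeholders.

```python
import numpy as np

def optimal_projection(R, R1, R2, lam1=0.1, lam2=0.1, m=None):
    """Eigendecompose U = R - lam1*R1 - lam2*R2 and return the projection matrix W
    whose columns are the eigenvectors of the m largest eigenvalues
    (by default m = number of eigenvalues greater than zero)."""
    U = R - lam1 * R1 - lam2 * R2
    # U is symmetric up to numerical error, so use the symmetric eigensolver
    vals, vecs = np.linalg.eigh((U + U.T) / 2.0)
    order = np.argsort(vals)[::-1]               # sort eigenvalues descending
    vals, vecs = vals[order], vecs[:, order]
    if m is None:
        m = int(np.sum(vals > 0))
    return vecs[:, :m]                           # D x m projection matrix W

# Dimensionality reduction: project training and test samples (columns of X and T)
#   X_low = W.T @ X
#   T_low = W.T @ T
```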
The effect of the present invention can be further illustrated by the following simulation experiments.
1. Simulation experiment conditions.
The experiments use the IndianPines data set as experimental data and MATLAB 7.10.0 as the simulation tool; the computer configuration is Intel Core i5/2.27 GHz/2 GB.
IndianPines hyperspectral data 92AV3C: this scene was acquired by the AVIRIS sensor in June 1992 over the IndianPines test site in northwestern Indiana. The data size is 145 × 145, and each pixel has 220 bands; after removing 20 noisy bands, only the remaining 200 bands are retained. The data contain 16 classes of ground objects in total. Fig. 3(a) shows the IndianPines hyperspectral data, and Fig. 3(b) shows the ground-truth label map of the IndianPines hyperspectral data.
2. Simulation experiment contents.
Simulation 1: simulation experiments are carried out on the IndianPines hyperspectral data given in Fig. 3(a) under different numbers of labeled samples, and the method of the present invention is compared, under the ground truth given in Fig. 3(b), with the following four existing dimensionality reduction methods: 1) semi-supervised dimensionality reduction based on sparsity preservation (SSDRsp); 2) semi-supervised dimensionality reduction based on pairwise constraints (SSDR); 3) local Fisher discriminant analysis (LFDA); 4) principal component analysis (PCA).
In the experiments, the spatial-consistency regularization parameter of the present invention is λ_1 and the manifold regularization parameter is λ_2. In the table, OA denotes overall accuracy, AA denotes average accuracy, and Kappa denotes the Kappa coefficient.
Table 1 gives the comparative results of 30 simulated classification runs with a nearest-neighbor classifier on the dimension-reduced data when the number of labeled samples per class is 5, 6 and 8 respectively.
Table 1: Comparison results of the present invention and existing methods under different numbers of labeled samples
As can be seen from Table 1, when the number of labeled samples per class is {5, 6, 8}, the accuracy of the present invention is the highest among the five methods listed in the table and its variance is the smallest, so it has better robustness.
Simulation 2: when the number of labeled samples per class is 5 and 8, the data after dimensionality reduction are classified with a nearest-neighbor classifier. The results are shown in Fig. 4, where Fig. 4(a) is the classification result map when the number of labeled samples is 5 and Fig. 4(b) is the classification result map when the number of labeled samples is 8.
As can be seen from Fig. 4, the present invention obtains good spatial structural consistency of the image even with only a small amount of supervision information, which demonstrates the effectiveness of the invention.
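For reference, a minimal sketch of how the evaluation used in the simulations could be scripted: a 1-nearest-neighbor classifier on the dimension-reduced data followed by the usual overall accuracy (OA), average accuracy (AA) and Kappa coefficient. Data loading and the 30-run loop are omitted, class labels are assumed to be integers 0..c-1, and all names are illustrative assumptions.

```python
import numpy as np

def nn_classify(X_low, y_train, T_low):
    """1-NN classification: label each test column by its nearest training column."""
    d = (np.sum(T_low ** 2, axis=0)[None, :]
         + np.sum(X_low ** 2, axis=0)[:, None]
         - 2.0 * (X_low.T @ T_low))
    return y_train[np.argmin(d, axis=0)]

def accuracy_metrics(y_true, y_pred, n_classes):
    """Overall accuracy, average (per-class) accuracy and Kappa coefficient."""
    conf = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        conf[t, p] += 1
    total = conf.sum()
    oa = np.trace(conf) / total
    aa = np.mean(np.diag(conf) / np.maximum(conf.sum(axis=1), 1))
    pe = np.sum(conf.sum(axis=0) * conf.sum(axis=1)) / total ** 2
    kappa = (oa - pe) / (1 - pe)
    return oa, aa, kappa
```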

Claims (2)

1. A semi-supervised hyperspectral data dimensionality reduction method based on the maximum neighbor margin, comprising the following steps:
(1) dividing a remote sensing image database sample set into a training data set X and a labeled sample set Y;
(2) generating the scatter matrices of the labeled sample set:
2a) generating the within-class scatter matrix of the labeled sample set by the within-class scatter formula:
C = \sum_{i,j: y_j \in N_i^o} \frac{(y_i - y_j)(y_i - y_j)^T}{|N_i^o|},
where C denotes the within-class scatter matrix, y_i and y_j denote the i-th and j-th labeled samples respectively, |·| denotes the number of elements in a set, T denotes matrix transposition, and N_i^o denotes the set of same-class (homogeneous) neighbors of the i-th labeled sample;
2b) generating the between-class scatter matrix of the labeled sample set by the between-class scatter formula:
S = \sum_{i,k: y_k \in N_i^e} \frac{(y_i - y_k)(y_i - y_k)^T}{|N_i^e|},
where S denotes the between-class scatter matrix, y_i and y_k denote the i-th and k-th labeled samples respectively, |·| denotes the number of elements in a set, T denotes matrix transposition, and N_i^e denotes the set of different-class (heterogeneous) neighbors of the i-th labeled sample;
(3) generating the spatial neighbor matrix of the training data set from the spatial neighborhood relation: L(i,j) = 1 if x_j is a spatial neighbor of x_i, and L(i,j) = 0 otherwise, where L denotes the spatial neighbor matrix, L(i,j) denotes the element in the i-th row and j-th column of L, and x_i and x_j denote the i-th and j-th training samples respectively;
(4) solving the low-rank representation of the training data set by the inexact augmented Lagrange multiplier method, and generating the similarity matrix Z of the training data set;
(5) from the within-class scatter matrix C and the between-class scatter matrix S, constructing the semi-supervised discriminant term by the maximum margin criterion: J(W) = tr(W^T(S - C)W), and defining R = S - C as the discriminant matrix, where W denotes the optimal projection matrix, tr(·) denotes the trace of a matrix, and T denotes matrix transposition;
(6) constructing the semi-supervised regularization terms:
6a) from the spatial neighbor matrix L, constructing the spatial-consistency regularization term by the local-consistency constraint:
J_{R_1}(W) = -\frac{1}{2} \sum_{i,j} (W^T x_i - W^T x_j)^2 L(i,j) = \mathrm{tr}\big(W^T X (L - \Delta_L) X^T W\big),
and defining R_1 = X(L - \Delta_L)X^T as the spatial-consistency regularization matrix, where L(i,j) denotes the element in the i-th row and j-th column of L, x_i denotes the i-th training sample, and \Delta_L denotes the diagonal matrix whose diagonal elements are the row sums of L, i.e. \Delta_L(i,i) = \sum_j L(i,j);
6b) from the similarity matrix Z, constructing the manifold regularization term by the sparsity-preserving criterion:
J_{R_2}(W) = -\frac{1}{M} \sum_i \|W^T x_i - W^T X z_i\|^2 = \mathrm{tr}\Big(W^T X \tfrac{1}{M}\big(Z^T + Z - Z^T Z - I\big) X^T W\Big),
and defining R_2 = \frac{1}{M} X(Z^T + Z - Z^T Z - I)X^T as the manifold regularization matrix, where z_i denotes the i-th column of the similarity matrix Z, I denotes the identity matrix, and M denotes the number of training samples;
(7) solving the optimal projection matrix:
from the discriminant matrix R, the spatial-consistency regularization matrix R_1 and the manifold regularization matrix R_2, obtaining the objective matrix U = R - λ_1 R_1 - λ_2 R_2;
performing eigenvalue decomposition on the objective matrix U, sorting the eigenvalues in descending order, and taking the eigenvectors corresponding to the first m eigenvalues to form the optimal projection matrix W, where λ_1 is the spatial-consistency regularization parameter, λ_2 is the manifold regularization parameter, and m denotes the dimension of the training and test samples after dimensionality reduction.
2. The semi-supervised hyperspectral data dimensionality reduction method based on the maximum neighbor margin according to claim 1, wherein dividing the remote sensing image database sample set into the training data set X and the labeled sample set Y in step (1) is carried out as follows:
1a) in the remote sensing data set to be processed, randomly selecting 40% of the data to form the training sample data set X ∈ R^{D×M}, and using the remaining 60% of the data as the test sample data set T ∈ R^{D×T}, where D denotes the dimension of the training and test samples, R^n denotes the n-dimensional real space, M denotes the total number of training samples, and T denotes the total number of test samples;
1b) in the training data set X, randomly selecting k samples from each class to form the labeled sample set Y ∈ R^{D×Q} carrying supervision information, where Q = c × k and c is the number of classes;
1c) in the labeled sample set Y, computing for each labeled sample y_i its homogeneous neighbor set N_i^o and heterogeneous neighbor set N_i^e by Euclidean distance.
CN201410213709.XA 2014-05-20 2014-05-20 Semi-supervision hyperspectral data dimension descending method based on largest neighbor boundary principle Expired - Fee Related CN104008394B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410213709.XA CN104008394B (en) 2014-05-20 2014-05-20 Semi-supervision hyperspectral data dimension descending method based on largest neighbor boundary principle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410213709.XA CN104008394B (en) 2014-05-20 2014-05-20 Semi-supervision hyperspectral data dimension descending method based on largest neighbor boundary principle

Publications (2)

Publication Number Publication Date
CN104008394A CN104008394A (en) 2014-08-27
CN104008394B true CN104008394B (en) 2017-02-15

Family

ID=51369043

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410213709.XA Expired - Fee Related CN104008394B (en) 2014-05-20 2014-05-20 Semi-supervision hyperspectral data dimension descending method based on largest neighbor boundary principle

Country Status (1)

Country Link
CN (1) CN104008394B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105023239B (en) * 2015-08-18 2018-03-13 西安电子科技大学 The high-spectral data dimension reduction method being distributed based on super-pixel and maximum boundary
CN105866040B (en) * 2016-03-25 2019-02-19 华南农业大学 Bacterial blight of rice high-spectrum image dimensionality reduction method based on profile plot
CN106067042B (en) * 2016-06-13 2019-02-15 西安电子科技大学 Polarization SAR classification method based on semi-supervised depth sparseness filtering network
CN106886793B (en) * 2017-01-23 2020-02-07 西安电子科技大学 Hyperspectral image waveband selection method based on discrimination information and manifold information
CN111325275B (en) * 2020-02-20 2023-05-23 南京审计大学 Robust image classification method and device based on low-rank two-dimensional local identification map embedding
CN112150396B (en) * 2020-10-15 2023-07-25 武汉轻工大学 Hyperspectral image dimension reduction method and device, terminal equipment and storage medium
CN112836671B (en) * 2021-02-26 2024-03-08 西北工业大学 Data dimension reduction method based on maximized ratio and linear discriminant analysis

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102902984A (en) * 2012-09-27 2013-01-30 西安电子科技大学 Remote-sensing image semi-supervised projection dimension reducing method based on local consistency
CN103544507A (en) * 2013-10-15 2014-01-29 中国矿业大学 Method for reducing dimensions of hyper-spectral data on basis of pairwise constraint discriminate analysis and non-negative sparse divergence

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080008372A1 (en) * 2006-07-07 2008-01-10 General Electric Company A method and system for reducing artifacts in a tomosynthesis imaging system
CA2723367A1 (en) * 2008-05-16 2009-11-19 Calgary Scientific Inc. Image texture characterization of medical images

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102902984A (en) * 2012-09-27 2013-01-30 西安电子科技大学 Remote-sensing image semi-supervised projection dimension reducing method based on local consistency
CN103544507A (en) * 2013-10-15 2014-01-29 中国矿业大学 Method for reducing dimensions of hyper-spectral data on basis of pairwise constraint discriminate analysis and non-negative sparse divergence

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on graph-based semi-supervised dimensionality reduction algorithms and their applications; Zhou Nan; China Master's Theses Full-text Database, Information Science and Technology; 2013-04-15 (No. 4); main text pp. 1-64 *

Also Published As

Publication number Publication date
CN104008394A (en) 2014-08-27

Similar Documents

Publication Publication Date Title
CN104008394B (en) Semi-supervision hyperspectral data dimension descending method based on largest neighbor boundary principle
Wu et al. ORSIm detector: A novel object detection framework in optical remote sensing imagery using spatial-frequency channel features
Zhang et al. Hyperspectral classification based on lightweight 3-D-CNN with transfer learning
Wang et al. Robust hyperspectral unmixing with correntropy-based metric
CN104123555A (en) Super-pixel polarimetric SAR land feature classification method based on sparse representation
CN103440508B (en) The Remote Sensing Target recognition methods of view-based access control model word bag model
CN109034213B (en) Hyperspectral image classification method and system based on correlation entropy principle
Hu et al. A comparative study of sampling analysis in scene classification of high-resolution remote sensing imagery
CN106446935A (en) Kernel sparse representation and spatial constraint-based polarimetric SAR image classification method
Deng et al. Citrus disease recognition based on weighted scalable vocabulary tree
Liu et al. Remote sensing image classification algorithm based on texture feature and extreme learning machine
CN105069471B (en) High-spectral data subspace projection based on fuzzy label and sorting technique
Liu et al. Feature extraction for hyperspectral remote sensing image using weighted PCA-ICA
Liu et al. Kernel low-rank representation based on local similarity for hyperspectral image classification
Chen et al. High-level feature selection with dictionary learning for unsupervised SAR imagery terrain classification
Dang et al. Spectral-spatial attention transformer with dense connection for hyperspectral image classification
CN106056131A (en) Image feature extraction method based on LRR-LDA
Jiang et al. Hyperspectral image supervised classification via multi-view nuclear norm based 2D PCA feature extraction and kernel ELM
Wang et al. A lightweight and stochastic depth residual attention network for remote sensing scene classification
Fu et al. Optimization of distributed convolutional neural network for image labeling on asynchronous GPU model
CN102136067B (en) Cayley-Menger determinant-based hyperspectral remote sensing image end member extracting method
CN111985501B (en) Hyperspectral image feature extraction method based on self-adaptive high-order tensor decomposition
CN106033545B (en) Wave band selection method of determinant point process
Li et al. HTDFormer: Hyperspectral Target Detection Based on Transformer With Distributed Learning
Du et al. Hyperspectral image change detection based on intrinsic image decomposition feature extraction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170215

CF01 Termination of patent right due to non-payment of annual fee