CN108805061A - Hyperspectral image classification method based on locality adaptive discriminant analysis - Google Patents

Hyperspectral image classification method based on locality adaptive discriminant analysis

Info

Publication number
CN108805061A
CN108805061A (application CN201810537158.0A)
Authority
CN
China
Prior art keywords
pixel
matrix
class
training set
dimensionality reduction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810537158.0A
Other languages
Chinese (zh)
Inventor
王琦
李学龙
孟照铁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201810537158.0A priority Critical patent/CN108805061A/en
Publication of CN108805061A publication Critical patent/CN108805061A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F 18/2132 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on discrimination criteria, e.g. discriminant analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F 18/24147 Distances to closest patterns, e.g. nearest neighbour classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/194 Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a hyperspectral image classification method based on locality adaptive discriminant analysis (LADA). First, a locality adaptive similarity weight matrix is constructed to measure the similarity of the spectral-channel features of the hyperspectral data. Then, the within-class scatter matrix and the between-class scatter matrix are computed, and a regularization scatter matrix is built from the pixels in a small spatial neighborhood of each test-set pixel so as to preserve spatial neighborhood information to the greatest extent. Next, following the idea of regularized linear discriminant analysis, a regularized identity matrix is introduced and the analytic solution of the transformation matrix is solved iteratively, which avoids both the ill-conditioning problem and the over-reduction problem. Finally, the features after dimensionality reduction are classified and the classification accuracy is computed. Because the low-dimensional features after dimensionality reduction contain rich spectral and spatial information, high classification accuracy can still be obtained even when the data distribution is unknown.

Description

Hyperspectral image classification method based on locality adaptive discriminant analysis
Technical field
The invention belongs to the technical field of hyperspectral image processing, and in particular relates to a hyperspectral image classification method based on locality adaptive discriminant analysis.
Background art
With the development of sensor technology, each pixel of a hyperspectral image can record hundreds or even thousands of spectral channels. If every pixel is classified correctly, the type of ground object that each region of the image corresponds to in the natural scene can be determined accurately. However, because of the high dimensionality of the pixel features and the insufficient number of labelled samples (pixels) in hyperspectral images, correct classification is very challenging. Research on these difficulties falls mainly into two categories of methods:
The first category comprises kernel-based algorithms. Their core idea is to map the original high-dimensional features to an even higher-dimensional feature space so that features of different classes become linearly separable after the mapping. Such algorithms can cope with the shortage of labelled high-dimensional samples. For example, F. Melgani et al., in "Classification of hyperspectral remote sensing images with support vector machines," IEEE Trans. Geosci. Remote Sens., vol. 42, no. 8, pp. 1778-1790, Aug. 2004, use a support vector machine as the hyperspectral image classifier and map the original features to higher-dimensional feature spaces with different kernel functions to verify the classification performance.
The second category comprises algorithms based on feature dimensionality reduction. Their core idea is to map the high-dimensional data to a low-dimensional feature subspace while ensuring that little of the discriminative information contained in the original feature space is lost. Principal component analysis (PCA) and linear discriminant analysis (LDA) are two widely used algorithms of this kind. PCA is an unsupervised dimensionality reduction method that converts a set of features into a set of linearly uncorrelated features, called principal components, through an orthogonal transformation. LDA learns a weight matrix with the objective of minimizing the within-class distance while maximizing the between-class distance, and the reduced features are obtained by multiplying the original feature data by this matrix. The converted or reduced features are then used for the subsequent classification task. As a supervised dimensionality reduction algorithm, LDA outperforms PCA, but it still has several defects. First, standard LDA cannot handle the ill-conditioning problem: when the number of training samples is smaller than the sample dimension, the within-class scatter matrix is not invertible and LDA has no solution. Second, LDA suffers from the over-reduction problem: if the training samples have C classes, LDA can reduce the dimension to at most C-1, and when there are few classes this overly low dimension causes the classification performance to drop sharply. Third, LDA obtains the optimal solution only when the data follow a Gaussian distribution; because it fails to make full use of the local manifold structure of the data, it performs poorly on more complicated data distributions.
To solve the above problems, T. V. Bandos et al., in "Classification of hyperspectral images with regularized linear discriminant analysis," IEEE Trans. Geosci. Remote Sens., vol. 47, no. 3, pp. 862-873, Mar. 2009, proposed the RLDA (Regularized Linear Discriminant Analysis) method, which introduces a regularization term into the objective function of LDA. This guarantees that, for a regularization parameter γ > 0, the within-class scatter matrix (S_w + γI) is always invertible, so the ill-conditioning problem of standard LDA is handled. However, this method still cannot cope with data whose distribution is not Gaussian. H. Yuan et al., in "Spectral-spatial classification of hyperspectral image based on discriminant analysis," IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens., vol. 7, no. 6, pp. 2035-2043, Jun. 2014, use RLDA to reduce the dimension of hyperspectral image features; based on the assumption that pixels in a small neighborhood usually belong to the same class, they construct a spatial-neighborhood scatter matrix as a new regularization term of the RLDA objective function, so that the original spatial neighborhood information is retained in the reduced feature subspace. The features processed by this method achieve better performance in the subsequent classification, but the method is still based on the assumption that the data follow a Gaussian distribution; it therefore cannot make full use of the local manifold structure of the data and cannot obtain a feature subspace with stronger discriminative power.
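To make the ill-conditioning issue and the RLDA remedy concrete, the following is a minimal numpy sketch (the toy sizes n = 20, d = 50 and the variable names are illustrative assumptions, not taken from the cited papers): with fewer samples than spectral bands, a scatter matrix built from the data is rank-deficient and cannot be inverted, while adding γI with any γ > 0 restores full rank.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, gamma = 20, 50, 1e-3              # fewer samples than bands, the ill-conditioned case

X = rng.normal(size=(d, n))             # columns are samples
Xc = X - X.mean(axis=1, keepdims=True)  # centre the samples
Sw = Xc @ Xc.T                          # a d x d scatter matrix built from only n samples

print(np.linalg.matrix_rank(Sw))        # at most n - 1 = 19, so Sw is singular
Sw_reg = Sw + gamma * np.eye(d)         # RLDA-style regularization
print(np.linalg.matrix_rank(Sw_reg))    # d = 50: invertible for any gamma > 0
```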
Summary of the invention
To overcome the shortcomings of existing LDA-type methods, the present invention proposes a hyperspectral image classification method based on locality adaptive discriminant analysis (LADA), so as to further improve the accuracy of hyperspectral image classification. First, a locality adaptive similarity weight matrix is constructed to measure the similarity of the spectral-channel features of the hyperspectral data. Then, the within-class scatter matrix and the between-class scatter matrix are computed, and a regularization scatter matrix is built from the pixels in a small spatial neighborhood of each test-set pixel so as to preserve spatial neighborhood information to the greatest extent. Next, following the idea of regularized linear discriminant analysis, a regularized identity matrix is introduced and the analytic solution of the transformation matrix is solved iteratively, which avoids both the ill-conditioning problem and the over-reduction problem. Finally, the features after dimensionality reduction are classified and the classification accuracy is computed. Because the low-dimensional features after dimensionality reduction contain rich spectral and spatial information, high classification accuracy can still be obtained even when the data distribution is unknown.
A hyperspectral image classification method based on locality adaptive discriminant analysis, characterized by comprising the following steps:
Step 1: Let the image resolution be P × Q, the number of spectral channels of each pixel be d, and the number of classes be C. Randomly extract 5% of the pixels of each class as the training set, denoted X = [x_1, x_2, ..., x_n], where x_i is the i-th pixel of the training set, i = 1, ..., n, and n is the number of pixels in the training set. All remaining pixels constitute the test set, denoted T = [t_1, t_2, ..., t_r], where t_j is the j-th pixel of the test set, j = 1, ..., r, and r is the number of pixels in the test set.
Step 2: Initialize the similarity between every two pixels of the training set to obtain the locality adaptive similarity weight matrix s. Specifically: if the p-th pixel and the q-th pixel of the training set belong to the same class c, the p-th pixel being the ci-th pixel of class c and the q-th pixel being the cj-th pixel of class c, then the similarity between the two pixels is s_cicj and the element in row p, column q of the matrix s is set to s_pq = s_cicj; otherwise s_pq = 0. Here p = 1, ..., n, q = 1, ..., n, n_c is the number of pixels of class c, c = 1, ..., C, ci = 1, ..., n_c, and cj = 1, ..., n_c.
Step 3: Construct the within-class scatter matrix S̃_w (formula (1)) and the between-class scatter matrix S̃_b: S̃_w accumulates, over every pair of same-class training pixels, the outer product of their difference weighted by the corresponding similarity, and S̃_b accumulates the outer product of the difference of every pair of training pixels.
Here x_ci^(c) denotes the ci-th pixel belonging to class c in the training set, x_cj^(c) the cj-th pixel belonging to class c, x_p the p-th pixel of the training set, and x_q the q-th pixel of the training set.
Step 4: For each pixel t_j of the test set, j = 1, ..., r, compute its classification result through the following steps 4.1 to 4.3:
Step 4.1: With t_j as the centre point, take the K−1 pixels in the √K × √K spatial neighborhood of t_j, obtaining K pixels in total. Denote t_j as z_1^j, and denote the remaining K−1 pixels, ordered by spatial position from left to right and from top to bottom, as z_k^j, k = 2, ..., K, obtaining Z_j = [z_1^j, z_2^j, ..., z_K^j]. Construct the scatter matrix S_Z^j of the spatial neighborhood of pixel t_j as the scatter of the K neighborhood pixels about their mean; K = 9 in the present invention.
Step 4.2: Solve for the transformation matrix G by iterative optimization through the following steps a to d:
Step a: Let the intended reduced dimension be m, m ≥ 1; randomly initialize the transformation matrix G, whose size is d × m; initialize hyperparameter one λ = 100 and hyperparameter two γ = 10^-3.
Step b: Fix the within-class scatter matrix S̃_w and update the transformation matrix G: compute the eigenvalues and eigenvectors of (S̃_w + λ·S_Z^j + γ·I)^-1·S̃_b, where I is the identity matrix of the same size as the scatter matrices; sort the eigenvalues in descending order, take the eigenvectors corresponding to the first m eigenvalues, and use each eigenvector as a column of the transformation matrix G to obtain the updated transformation matrix G.
Step c: Fix the updated transformation matrix G obtained in step b and update the within-class scatter matrix S̃_w: first, update the similarity value s_cicj of the ci-th and the cj-th pixel of class c, c = 1, ..., C, according to the similarity update formula (a larger s_cicj indicates that the two pixels are more similar in the current low-dimensional subspace); then compute the updated within-class scatter matrix S̃_w according to formula (1).
Step d: Evaluate tr(G^T·S̃_w·G) with the transformation matrix G obtained in the current iteration and, separately, with the transformation matrix G obtained in the previous iteration. If the difference between the two values is less than or equal to 10^-3, the transformation matrix G obtained in the current iteration is the final transformation matrix G; otherwise return to step b. Here tr(·) denotes the trace of a matrix and S̃_w is the within-class scatter matrix obtained in the current iteration.
Step 4.3: Compute the feature matrix Y of the training set after dimensionality reduction as Y = G^T·X, and compute the feature T_j' of the test-set pixel t_j after dimensionality reduction as T_j' = G^T·t_j. Use a 1NN classifier to compute the Euclidean distance between T_j' and each column of Y, and take the class label of the training-set pixel corresponding to the column with the smallest Euclidean distance as the classification result predict(t_j) of the test-set pixel t_j.
Step 5: Compute the overall accuracy as OA = N(TP)/r, where N(TP) is the number of correctly classified pixels in the test set; a correctly classified pixel is a pixel whose classification result obtained in step 4 is equal to its labelled class label.
The beneficial effects of the invention are as follows. The locality adaptive similarity weight matrix is used to measure the similarity of the spectral-channel features of the hyperspectral data, which avoids the assumption of traditional LDA-type algorithms that the hyperspectral data follow a Gaussian distribution; at the same time, a scatter matrix built from the pixels of a small spatial neighborhood of each test-set pixel is used as a regularization term to retain spatial neighborhood information. The low-dimensional features finally obtained therefore contain rich spectral and spatial information, and high classification accuracy can be obtained in the subsequent nearest-neighbour classification even when the data distribution is unknown. When solving the transformation matrix, the idea of regularized linear discriminant analysis is adopted and a regularized identity matrix is introduced, so the analytic solution of the transformation matrix can be obtained while avoiding the ill-conditioning problem; moreover, because the between-class scatter matrix has full rank, the dimension of the transformation matrix can be set manually and the over-reduction problem no longer exists.
Description of the drawings
Fig. 1 is a flowchart of the hyperspectral image classification method based on locality adaptive discriminant analysis of the present invention.
Fig. 2 is a schematic diagram of the overall classification accuracy of different methods on the Indian Pines image data set.
Detailed description of the embodiments
The present invention is further described below with reference to the accompanying drawings and embodiments; the present invention includes, but is not limited to, the following embodiments.
As shown in Fig. 1, the hyperspectral image classification method based on locality adaptive discriminant analysis of the present invention is implemented through the following steps:
1. Prepare the data set
Let the resolution of the hyperspectral image be P × Q, the number of spectral channels of each pixel be d, and the number of classes be C. Randomly extract 5% of the pixels of each class as the training set, denoted X = [x_1, x_2, ..., x_n], where x_i ∈ R^d is the i-th pixel of the training set, i = 1, ..., n, and n is the number of pixels in the training set. All remaining pixels constitute the test set T = [t_1, t_2, ..., t_r], where t_j ∈ R^d is the j-th pixel of the test set, j = 1, ..., r, and r is the number of pixels in the test set.
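As a concrete illustration of this step, the following is a minimal numpy sketch of the per-class 5% split; the array names (data, labels), the background label 0 and the helper name split_train_test are illustrative assumptions rather than part of the patent.

```python
import numpy as np

def split_train_test(data, labels, ratio=0.05, seed=0):
    """data: (P*Q, d) pixel spectra; labels: (P*Q,) class ids, with 0 meaning unlabelled."""
    rng = np.random.default_rng(seed)
    train_idx, test_idx = [], []
    for c in np.unique(labels):
        if c == 0:                                   # skip unlabelled background pixels
            continue
        idx = rng.permutation(np.flatnonzero(labels == c))
        k = max(1, int(round(ratio * idx.size)))     # 5% of each class goes to training
        train_idx.extend(idx[:k])
        test_idx.extend(idx[k:])
    train_idx, test_idx = np.array(train_idx), np.array(test_idx)
    X = data[train_idx].T                            # X: (d, n), training pixels as columns
    T = data[test_idx].T                             # T: (d, r), test pixels as columns
    return X, labels[train_idx], T, labels[test_idx], test_idx
```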
2. Construct the locality adaptive similarity weight matrix s
The similarity of every two pixels of the training set is initialized: if the p-th pixel corresponds to the ci-th pixel of class c and the q-th pixel corresponds to the cj-th pixel of class c, the similarity between the two pixels is s_cicj and the element in row p, column q of the matrix s is set to s_pq = s_cicj, where n_c is the number of pixels of class c, c = 1, ..., C; if the p-th pixel and the q-th pixel of the training set do not belong to the same class, then s_pq = 0. Here p = 1, ..., n, q = 1, ..., n, ci = 1, ..., n_c, cj = 1, ..., n_c. The locality adaptive similarity weight matrix s ∈ R^(n×n) constructed in this way is a block-diagonal matrix, and the elements inside each block are the similarities between two pixels belonging to the same class. Measuring the similarity of the spectral-channel features of the hyperspectral data with the locality adaptive similarity weight matrix avoids the assumption of traditional LDA-type algorithms that the hyperspectral data follow a Gaussian distribution. A sketch of this construction follows.
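The numerical value used to initialize each within-class similarity is given by a formula image that is not reproduced in this text; the sketch below therefore assumes the simple uniform choice s_cicj = 1/n_c inside each class block (an assumption), which yields the block-diagonal structure described above.

```python
import numpy as np

def init_similarity(y_train):
    """Block-diagonal locality adaptive similarity weight matrix s (n x n).
    The uniform 1/n_c initialization inside each class block is an assumption;
    the exact initial value is given by the (unreproduced) formula of the filing."""
    n = y_train.size
    s = np.zeros((n, n))
    for c in np.unique(y_train):
        idx = np.flatnonzero(y_train == c)
        s[np.ix_(idx, idx)] = 1.0 / idx.size         # same-class block; other entries stay 0
    return s
```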
3. Construct the within-class scatter matrix S̃_w and the between-class scatter matrix S̃_b
Given the hyperspectral image training set X, the spectral features are modelled with the locality adaptive similarity weight matrix s of step 2, which gives objective (4) below:
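The formula image itself does not survive in this text; rendered in LaTeX, a form consistent with the description that follows (similarity-weighted within-class pairwise distances in the numerator, all pairwise distances in the denominator, both measured after projection by G) would be:

```latex
% Objective (4), as reconstructed from the surrounding description
\[
\min_{G}\;
\frac{\sum_{c=1}^{C}\sum_{ci=1}^{n_c}\sum_{cj=1}^{n_c}
      s_{ci\,cj}\,\bigl\lVert G^{\top}x^{(c)}_{ci}-G^{\top}x^{(c)}_{cj}\bigr\rVert_2^{2}}
     {\sum_{p=1}^{n}\sum_{q=1}^{n}
      \bigl\lVert G^{\top}x_{p}-G^{\top}x_{q}\bigr\rVert_2^{2}}
\]
```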
Here G ∈ R^(d×m) is the transformation matrix that converts the original spectral features into the low-dimensional space, d is the dimension of the original spectral features and m is the dimension after reduction; x_ci^(c) denotes the ci-th pixel belonging to class c in the training set, x_cj^(c) the cj-th pixel belonging to class c, x_p the p-th pixel of the training set and x_q the q-th pixel of the training set. The numerator of formula (4) corresponds to the sum of the distances between same-class pixels after dimensionality reduction, and the denominator corresponds to the sum of the distances between all sample points after dimensionality reduction. Borrowing from classical linear discriminant analysis the idea of simultaneously minimizing the within-class distance and maximizing the between-class distance, the present invention minimizes the ratio of the within-class pairwise distance sum to the all-pixel pairwise distance sum after dimensionality reduction, and uses the locality adaptive similarity weight matrix s to constrain the similarity relations of the pixels in the low-dimensional feature subspace: the larger s_cicj is, the more similar the ci-th and the cj-th pixel of class c are in the low-dimensional feature subspace. This model maps the features into a low-dimensional feature subspace while fully preserving the spectral information.
The within-class scatter matrix S̃_w and the between-class scatter matrix S̃_b are then, respectively, the similarity-weighted sum of the outer products of the differences of same-class training pixels (formula (5)) and the sum of the outer products of the differences of all pairs of training pixels. A sketch of their computation follows.
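The formula images for S̃_w and S̃_b are not reproduced here; the numpy sketch below reconstructs them from objective (4) using the standard graph-Laplacian identity sum_{p,q} w_pq (x_p − x_q)(x_p − x_q)^T = 2 X (D − W) X^T with D = diag(W·1), which holds for a symmetric weight matrix W. The function name and the vectorized form are illustrative choices.

```python
import numpy as np

def scatter_matrices(X, s):
    """Within-class scatter Sw (weighted by the block-diagonal s) and between-class
    scatter Sb over all training pairs, reconstructed from objective (4).
    X: (d, n) training pixels as columns; s: (n, n) similarity weight matrix."""
    d, n = X.shape
    D = np.diag(s.sum(axis=1))
    Sw = 2.0 * X @ (D - s) @ X.T                          # sum_{p,q} s_pq (x_p - x_q)(x_p - x_q)^T
    Sb = 2.0 * X @ (n * np.eye(n) - np.ones((n, n))) @ X.T  # all-pairs differences, unit weights
    return Sw, Sb
```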
4. Construct the regularization scatter matrix
To better retain spatial neighborhood information, a scatter matrix built from the pixels of a small spatial neighborhood of each test-set pixel is used as a regularization term. That is, for the j-th pixel t_j of the test set, j = 1, ..., r, take t_j as the centre point and the K−1 pixels in its √K × √K spatial neighborhood, obtaining K pixels. Denote t_j as z_1^j, and denote the remaining K−1 pixels, ordered by spatial position from left to right and from top to bottom, as z_k^j, k = 2, ..., K, obtaining Z_j = [z_1^j, z_2^j, ..., z_K^j]. In a hyperspectral image, the pixels in a small spatial neighborhood tend to belong to the same class; in order for the spatial neighborhood information of each test pixel to be retained in the low-dimensional feature subspace after the transformation, the neighborhood is modelled by formula (7).
Here G^T·z_k^j is the feature of the k-th pixel of the spatial neighborhood after dimensionality reduction, z̄ is the mean feature of all pixels of the spatial neighborhood after dimensionality reduction, and the norm denotes the distance between the reduced features. By minimizing the sum of the distances between the reduced feature of each pixel of the spatial neighborhood and z̄, the pixels of the same spatial neighborhood are guaranteed to remain in a close neighborhood in the low-dimensional feature subspace.
The regularization scatter matrix S_Z^j is given by formula (8), where 1_K is the K-dimensional vector whose elements are all equal to 1 and tr(·) denotes the trace of a matrix; K = 9 in the present invention. A sketch of this construction follows.
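The formula images (7) and (8) are not reproduced here; the numpy sketch below computes S_Z^j as the scatter of the 3 × 3 (√K × √K, K = 9) neighborhood pixels about their mean, using the centring-matrix form Z_j (I − (1/K)·1_K·1_K^T) Z_j^T, which is consistent with the all-ones vector 1_K mentioned above. The cube layout, the border clipping and the function name are illustrative assumptions.

```python
import numpy as np

def neighborhood_scatter(cube, row, col, K=9):
    """Regularization scatter S_Z^j for the test pixel at (row, col).
    cube: (P, Q, d) hyperspectral image; uses the sqrt(K) x sqrt(K) window centred on
    the pixel, clipped at the image border (border handling is an assumption)."""
    P, Q, d = cube.shape
    w = int(np.sqrt(K)) // 2                         # half window; 1 for the 3 x 3 case
    rs = slice(max(0, row - w), min(P, row + w + 1))
    cs = slice(max(0, col - w), min(Q, col + w + 1))
    Z = cube[rs, cs, :].reshape(-1, d).T             # (d, K') neighborhood pixels as columns
    Kp = Z.shape[1]
    H = np.eye(Kp) - np.ones((Kp, Kp)) / Kp          # centring matrix I - (1/K) 1 1^T
    return Z @ H @ Z.T                               # sum_k (z_k - z_mean)(z_k - z_mean)^T
```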
5. Iteratively solve the transformation matrix G by optimization
Substituting formula (7) into formula (4) as the regularization term and simplifying gives formula (9):
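The formula image of (9) is not reproduced here; a form consistent with the surrounding text (the trace of the projected within-class scatter plus the λ-weighted trace of the projected neighborhood scatter, divided by the trace of the projected between-class scatter) would be:

```latex
% Formula (9), as reconstructed from the surrounding description
\[
\min_{G}\;
\frac{\operatorname{tr}\bigl(G^{\top}\tilde{S}_{w}G\bigr)
      +\lambda\,\operatorname{tr}\bigl(G^{\top}S_{Z}^{\,j}G\bigr)}
     {\operatorname{tr}\bigl(G^{\top}\tilde{S}_{b}G\bigr)}
\]
```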
Here tr(·) denotes the trace of a matrix. To avoid an ill-conditioning problem, the idea of the RLDA (Regularized Linear Discriminant Analysis) method is borrowed and a regularized identity matrix γ·I is introduced: when γ > 0, the regularized matrix S̃_w + λ·S_Z^j + γ·I is always invertible, which guarantees that no ill-conditioning problem exists when solving the transformation matrix G. Moreover, because the between-class scatter matrix has full rank, the dimension of the transformation matrix can be set manually and the over-reduction problem no longer exists. The detailed process of iteratively solving the transformation matrix G is:
1) Initialization: set the intended reduced dimension m (m ≥ 1) and the hyperparameters λ and γ, and randomly initialize the transformation matrix G ∈ R^(d×m). In this embodiment, hyperparameter one is λ = 100, hyperparameter two is γ = 10^-3, and m = 10.
2) Fix the within-class scatter matrix S̃_w and update the transformation matrix G: compute the eigenvalues and eigenvectors of (S̃_w + λ·S_Z^j + γ·I)^-1·S̃_b and sort the eigenvalues in descending order; then take the eigenvectors corresponding to the first m eigenvalues and combine them, each eigenvector being one column, into a d × m matrix, which is the updated transformation matrix G.
3) Fix the transformation matrix G and update the within-class scatter matrix S̃_w: first, update the similarity value s_cicj of the ci-th and the cj-th pixel of class c, c = 1, ..., C, according to the similarity update formula; then compute the updated within-class scatter matrix S̃_w according to formula (5).
4) Evaluate tr(G^T·S̃_w·G) with the transformation matrix G and the within-class scatter matrix S̃_w obtained in the current iteration, and then evaluate it with the transformation matrix G obtained in the previous iteration and the within-class scatter matrix obtained in this iteration. If the difference between the two values is less than or equal to 10^-3, stop the iteration; the transformation matrix G obtained in step 2) of the current iteration is the final transformation matrix G. Otherwise, return to step 2) for the next iteration and continue updating the transformation matrix G and the within-class scatter matrix S̃_w. Here tr(·) denotes the trace of a matrix. A sketch of the whole alternating procedure follows.
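Below is a minimal numpy sketch of this alternating optimization, reusing the helpers init_similarity, scatter_matrices and neighborhood_scatter sketched above. Because the formula images do not survive in this text, three ingredients are stated as assumptions rather than as the patent's exact expressions: the eigenproblem is taken as (S̃_w + λ·S_Z + γ·I)^-1·S̃_b with the m largest eigenvalues retained, the similarity update is taken as an inverse-distance rule that grows as two projected same-class pixels get closer, and the convergence quantity is taken as tr(G^T·S̃_w·G).

```python
import numpy as np

def solve_transform(X, y_train, Sz, m=10, lam=100.0, gamma=1e-3, tol=1e-3, max_iter=50, seed=0):
    """Alternating solution of the transformation matrix G (d x m); see the lead-in
    for which formulas are assumptions."""
    d, n = X.shape
    rng = np.random.default_rng(seed)
    G = rng.normal(size=(d, m))                        # step 1): random initialization
    s = init_similarity(y_train)                       # initial block-diagonal weights
    Sw, Sb = scatter_matrices(X, s)
    for _ in range(max_iter):
        # step 2): fix Sw, update G from the m leading eigenvectors
        M = np.linalg.solve(Sw + lam * Sz + gamma * np.eye(d), Sb)
        vals, vecs = np.linalg.eig(M)
        G_new = vecs[:, np.argsort(-vals.real)[:m]].real   # eigenvectors as columns of G
        # step 3): fix G, update the within-class similarities (assumed inverse-distance rule)
        for c in np.unique(y_train):
            idx = np.flatnonzero(y_train == c)
            Pc = G_new.T @ X[:, idx]                   # projected class-c pixels, (m, n_c)
            dist = np.linalg.norm(Pc[:, :, None] - Pc[:, None, :], axis=0)
            blk = 1.0 / (2.0 * dist + 1e-12)
            np.fill_diagonal(blk, 0.0)                 # self-pairs contribute nothing to Sw
            s[np.ix_(idx, idx)] = blk
        Sw_new, _ = scatter_matrices(X, s)
        # step 4): stop when tr(G^T Sw G) barely changes between successive G
        if abs(np.trace(G_new.T @ Sw_new @ G_new) - np.trace(G.T @ Sw_new @ G)) <= tol:
            return G_new
        G, Sw = G_new, Sw_new
    return G
```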
6. 1NN classification
Compute the feature matrix of the training-set pixels after dimensionality reduction, Y = G^T·X ∈ R^(m×n), and the feature of the test-set pixel t_j after dimensionality reduction, T_j' = G^T·t_j ∈ R^m. Use a 1NN classifier to compute the Euclidean distance between T_j' and each column of Y; if a certain column of Y has the smallest Euclidean distance, the class label of the training-set pixel corresponding to that column is taken as the classification result predict(t_j) of the test-set pixel t_j. A sketch of this step is given below.
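A minimal numpy sketch of the 1NN assignment; the function name and the squared-distance vectorization are illustrative. The commented usage also shows the overall-accuracy computation of section 7 and reflects the per-test-pixel structure of step 4, where G is solved with the neighborhood scatter of the pixel being classified.

```python
import numpy as np

def classify_1nn(G, X, y_train, T):
    """Project training (X) and test (T) pixels with G and assign each test pixel the
    label of its nearest training column in Euclidean distance."""
    Y = G.T @ X                                       # (m, n) reduced training features
    Tp = G.T @ T                                      # (m, r) reduced test features
    d2 = (np.sum(Tp ** 2, axis=0)[:, None]            # squared Euclidean distances, (r, n)
          + np.sum(Y ** 2, axis=0)[None, :]
          - 2.0 * Tp.T @ Y)
    return y_train[np.argmin(d2, axis=1)]             # label of the closest training pixel

# Per the per-pixel procedure of step 4, e.g. for the j-th test pixel t_j (a (d,) vector):
#   Sz   = neighborhood_scatter(cube, row_j, col_j)
#   G_j  = solve_transform(X, y_train, Sz)
#   pred = classify_1nn(G_j, X, y_train, t_j[:, None])[0]
# Overall accuracy of section 7:  OA = N(TP) / r  =  np.mean(all_preds == y_test)
```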
Each pixel of the test set is processed through steps 4 to 6 to obtain its classification result, which yields the classification results of all pixels of the test set.
Because the low-dimensional features after dimensionality reduction contain rich spectral and spatial information, high classification accuracy can be obtained in the nearest-neighbour classification even when the data distribution is unknown.
7. Compute the overall accuracy OA (Overall Accuracy)
The overall accuracy OA is defined as follows:
OA = N(TP)/N(T) (10)
where TP denotes the set of correctly classified pixels and N(TP) the number of correctly classified pixels. If the classification result predict(t_j) of a test-set pixel is equal to its labelled class label class(t_j), i.e. predict(t_j) = class(t_j), the classification is correct; otherwise it is wrong. T denotes the test set and N(T) the number of pixels in the test set, i.e. r.
The present embodiment was tested with MATLAB software on a machine with an Intel i5-3470 3.2 GHz CPU and 16 GB of memory running the WINDOWS 10 operating system. The data used in the experiment is the Indian Pines image: the original image consists of 145 × 145 pixels and 224 spectral channels; 24 spectral channels affected by water absorption are removed, leaving 200 spectral channels available for classification. Each pixel has a labelled class label, and there are 16 classes in total.
Fig. 2 compares the overall classification accuracy of the method of the present invention (LADA) with that of other methods on the Indian Pines image data set. The comparison methods are the regularized LDA method (RLDA), the semi-supervised discriminant analysis method (SDA) and the spectral-spatial LDA method (SSLDA). The overall classification accuracy of the LADA method of the present invention is 0.9175, which is higher than that of the other three methods, whose overall classification accuracies are 0.6775, 0.7565 and 0.9048 respectively. The classification results also show that, compared with the SSLDA method, which likewise uses a spatial neighborhood constraint, the LADA method of the present invention constrains the transformation matrix through the locality adaptive similarity weight matrix, which ensures that the low-dimensional features obtained through the transformation matrix contain better spectral feature information and spatial neighborhood information, thereby improving the performance of the subsequent classification based on these features. In addition, compared with methods such as RLDA and SDA that do not exploit a spatial neighborhood constraint, the classification accuracy of the present invention is greatly improved, which effectively demonstrates the importance of the spatial neighborhood constraint.

Claims (1)

1. A hyperspectral image classification method based on locality adaptive discriminant analysis, characterized by comprising the following steps:
Step 1: Let the image resolution be P × Q, the number of spectral channels of each pixel be d, and the number of classes be C. Randomly extract 5% of the pixels of each class as the training set, denoted X = [x_1, x_2, ..., x_n], where x_i is the i-th pixel of the training set, i = 1, ..., n, and n is the number of pixels in the training set; all remaining pixels constitute the test set, denoted T = [t_1, t_2, ..., t_r], where t_j is the j-th pixel of the test set, j = 1, ..., r, and r is the number of pixels in the test set;
Step 2: Initialize the similarity between every two pixels of the training set to obtain the locality adaptive similarity weight matrix s. Specifically: if the p-th pixel and the q-th pixel of the training set belong to the same class c, the p-th pixel being the ci-th pixel of class c and the q-th pixel being the cj-th pixel of class c, then the similarity between the two pixels is s_cicj and the element in row p, column q of the matrix s is set to s_pq = s_cicj; otherwise s_pq = 0. Here p = 1, ..., n, q = 1, ..., n, n_c is the number of pixels of class c, c = 1, ..., C, ci = 1, ..., n_c, cj = 1, ..., n_c;
Step 3: Construct the within-class scatter matrix S̃_w (formula (1)) and the between-class scatter matrix S̃_b: S̃_w accumulates, over every pair of same-class training pixels, the outer product of their difference weighted by the corresponding similarity, and S̃_b accumulates the outer product of the difference of every pair of training pixels,
where x_ci^(c) denotes the ci-th pixel belonging to class c in the training set, x_cj^(c) the cj-th pixel belonging to class c, x_p the p-th pixel of the training set, and x_q the q-th pixel of the training set;
Step 4: For each pixel t_j of the test set, j = 1, ..., r, compute its classification result through the following steps 4.1 to 4.3:
Step 4.1: With t_j as the centre point, take the K−1 pixels in the √K × √K spatial neighborhood of t_j, obtaining K pixels in total. Denote t_j as z_1^j, and denote the remaining K−1 pixels, ordered by spatial position from left to right and from top to bottom, as z_k^j, k = 2, ..., K, obtaining Z_j = [z_1^j, z_2^j, ..., z_K^j]. Construct the scatter matrix S_Z^j of the spatial neighborhood of pixel t_j as the scatter of the K neighborhood pixels about their mean; K = 9 in the present invention;
Step 4.2: Solve for the transformation matrix G by iterative optimization through the following steps a to d:
Step a: Let the intended reduced dimension be m, m ≥ 1; randomly initialize the transformation matrix G, whose size is d × m; initialize hyperparameter one λ = 100 and hyperparameter two γ = 10^-3;
Step b: Fix the within-class scatter matrix S̃_w and update the transformation matrix G: compute the eigenvalues and eigenvectors of (S̃_w + λ·S_Z^j + γ·I)^-1·S̃_b, where I is the identity matrix of the same size as the scatter matrices; sort the eigenvalues in descending order, take the eigenvectors corresponding to the first m eigenvalues, and use each eigenvector as a column of the transformation matrix G to obtain the updated transformation matrix G;
Step c: Fix the updated transformation matrix G obtained in step b and update the within-class scatter matrix S̃_w: first, update the similarity value s_cicj of the ci-th and the cj-th pixel of class c, c = 1, ..., C, according to the similarity update formula (a larger s_cicj indicates that the two pixels are more similar in the current low-dimensional subspace); then compute the updated within-class scatter matrix S̃_w according to formula (1);
Step d: Evaluate tr(G^T·S̃_w·G) with the transformation matrix G obtained in the current iteration and, separately, with the transformation matrix G obtained in the previous iteration. If the difference between the two values is less than or equal to 10^-3, the transformation matrix G obtained in the current iteration is the final transformation matrix G; otherwise return to step b. Here tr(·) denotes the trace of a matrix and S̃_w is the within-class scatter matrix obtained in the current iteration;
Step 4.3: Compute the feature matrix Y of the training set after dimensionality reduction as Y = G^T·X, and compute the feature T_j' of the test-set pixel t_j after dimensionality reduction as T_j' = G^T·t_j. Use a 1NN classifier to compute the Euclidean distance between T_j' and each column of Y, and take the class label of the training-set pixel corresponding to the column with the smallest Euclidean distance as the classification result predict(t_j) of the test-set pixel t_j;
Step 5: Compute the overall accuracy as OA = N(TP)/r, where N(TP) is the number of correctly classified pixels in the test set; a correctly classified pixel is a pixel whose classification result obtained in step 4 is equal to its labelled class label.
CN201810537158.0A 2018-05-30 2018-05-30 Hyperspectral image classification method based on local auto-adaptive discriminant analysis Pending CN108805061A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810537158.0A CN108805061A (en) 2018-05-30 2018-05-30 Hyperspectral image classification method based on local auto-adaptive discriminant analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810537158.0A CN108805061A (en) 2018-05-30 2018-05-30 Hyperspectral image classification method based on local auto-adaptive discriminant analysis

Publications (1)

Publication Number Publication Date
CN108805061A true CN108805061A (en) 2018-11-13

Family

ID=64089240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810537158.0A Pending CN108805061A (en) 2018-05-30 2018-05-30 Hyperspectral image classification method based on local auto-adaptive discriminant analysis

Country Status (1)

Country Link
CN (1) CN108805061A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102208034A (en) * 2011-07-16 2011-10-05 西安电子科技大学 Semi-supervised dimension reduction-based hyper-spectral image classification method
CN105023024A (en) * 2015-07-23 2015-11-04 湖北大学 Remote sensing image classification method and system based on regularization set metric learning
CN107220662A (en) * 2017-05-16 2017-09-29 西北工业大学 The hyperspectral image band selection method clustered based on global optimum

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
QI WANG ET.AL: "Locality Adaptive Discriminant Analysis for Spectral–Spatial Classification of Hyperspectral Images", 《IEEE GEOSCIENCE AND REMOTE SENSING LETTERS》 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109685105A (en) * 2018-11-16 2019-04-26 中国矿业大学 A kind of high spectrum image clustering method based on the study of unsupervised width
CN110163274A (en) * 2019-05-15 2019-08-23 南京邮电大学 A kind of object classification method based on ghost imaging and linear discriminant analysis
CN110163274B (en) * 2019-05-15 2022-08-30 南京邮电大学 Object classification method based on ghost imaging and linear discriminant analysis
CN110399885A (en) * 2019-07-12 2019-11-01 武汉科技大学 A kind of image object classification method based on local geometric perception
CN110399885B (en) * 2019-07-12 2022-05-27 武汉科技大学 Image target classification method based on local geometric perception
CN111783865A (en) * 2020-06-23 2020-10-16 西北工业大学 Hyperspectral classification method based on space spectrum neighborhood embedding and optimal similarity graph
CN112836671A (en) * 2021-02-26 2021-05-25 西北工业大学 Data dimension reduction method based on maximization ratio and linear discriminant analysis
WO2022178978A1 (en) * 2021-02-26 2022-09-01 西北工业大学 Data dimensionality reduction method based on maximum ratio and linear discriminant analysis
CN112836671B (en) * 2021-02-26 2024-03-08 西北工业大学 Data dimension reduction method based on maximized ratio and linear discriminant analysis
CN117079058A (en) * 2023-10-11 2023-11-17 腾讯科技(深圳)有限公司 Image processing method and device, storage medium and electronic equipment
CN117079058B (en) * 2023-10-11 2024-01-09 腾讯科技(深圳)有限公司 Image processing method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN110443143B (en) Multi-branch convolutional neural network fused remote sensing image scene classification method
CN108805061A (en) Hyperspectral image classification method based on local auto-adaptive discriminant analysis
CN110414377B (en) Remote sensing image scene classification method based on scale attention network
CN110717354B (en) Super-pixel classification method based on semi-supervised K-SVD and multi-scale sparse representation
CN109344736B (en) Static image crowd counting method based on joint learning
CN104484681B (en) Hyperspectral Remote Sensing Imagery Classification method based on spatial information and integrated study
Van der Maaten et al. Visualizing data using t-SNE.
CN110135267A (en) A kind of subtle object detection method of large scene SAR image
CN103886342B (en) Hyperspectral image classification method based on spectrums and neighbourhood information dictionary learning
CN103489005B (en) A kind of Classification of High Resolution Satellite Images method based on multiple Classifiers Combination
CN109523520A (en) A kind of chromosome automatic counting method based on deep learning
CN110309868A (en) In conjunction with the hyperspectral image classification method of unsupervised learning
CN105718942B (en) High spectrum image imbalance classification method based on average drifting and over-sampling
CN108171122A (en) The sorting technique of high-spectrum remote sensing based on full convolutional network
CN110287873A (en) Noncooperative target pose measuring method, system and terminal device based on deep neural network
CN104281835B (en) Face recognition method based on local sensitive kernel sparse representation
CN108447057A (en) SAR image change detection based on conspicuousness and depth convolutional network
CN103955709B (en) Weighted synthetic kernel and triple markov field (TMF) based polarimetric synthetic aperture radar (SAR) image classification method
CN110414616B (en) Remote sensing image dictionary learning and classifying method utilizing spatial relationship
CN111639697B (en) Hyperspectral image classification method based on non-repeated sampling and prototype network
He et al. Object-oriented mangrove species classification using hyperspectral data and 3-D Siamese residual network
CN114998220A (en) Tongue image detection and positioning method based on improved Tiny-YOLO v4 natural environment
CN112115806B (en) Remote sensing image scene accurate classification method based on Dual-ResNet small sample learning
CN111784777A (en) SMT material quantity statistical method and statistical system based on convolutional neural network
CN114926693A (en) SAR image small sample identification method and device based on weighted distance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20181113)