CN104408478B - Hyperspectral image classification method based on hierarchical sparse discriminative feature learning - Google Patents

Hyperspectral image classification method based on hierarchical sparse discriminative feature learning

Info

Publication number
CN104408478B
Authority
CN
China
Prior art keywords
layer
feature
sample
dictionary
sparse coding
Prior art date
Legal status
Active
Application number
CN201410647211.4A
Other languages
Chinese (zh)
Other versions
CN104408478A (en)
Inventor
张向荣
焦李成
梁云龙
马文萍
侯彪
刘若辰
马晶晶
白静
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201410647211.4A priority Critical patent/CN104408478B/en
Publication of CN104408478A publication Critical patent/CN104408478A/en
Application granted granted Critical
Publication of CN104408478B publication Critical patent/CN104408478B/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 — Pattern recognition
    • G06F18/20 — Analysing
    • G06F18/24 — Classification techniques
    • G06F18/241 — Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 — Classification techniques relating to the classification model based on the proximity to a decision surface, e.g. support vector machines
    • G06F18/245 — Classification techniques relating to the decision surface

Abstract

The invention discloses a hyperspectral image classification method based on hierarchical sparse discriminative feature learning, mainly aimed at the problem that the prior art cannot adequately learn feature representations of the neighborhood blocks of hyperspectral data. The implementation steps are: input a hyperspectral image data sample set and select a training set and a test set from it; based on the selected training set and sample set, apply the hierarchical discriminative feature learning method based on sparse coding to obtain first-layer discriminative features and second-layer discriminative features; combine the first-layer and second-layer discriminative features into hierarchical discriminative features; classify the hierarchical discriminative features with a support vector machine and output the classification result. Building on the spatial pyramid sparse coding model, the invention adds discriminative dictionary learning supervised by class labels and enhances the discriminability of the features with two layers of discriminative feature learning based on the spatial pyramid sparse model, improving classification accuracy so that hyperspectral data are classified more accurately.

Description

Hyperspectral image classification method based on hierarchical sparse discriminative feature learning
Technical field
The invention belongs to the technical field of image processing and relates to machine learning and hyperspectral image processing; specifically, it is a hyperspectral image classification method based on hierarchical sparse discriminative feature learning. Through discriminative feature learning on hyperspectral data, the invention appropriately characterizes the features of the different ground objects in a hyperspectral image, so that on this basis a computer can autonomously classify and identify the different ground objects of the hyperspectral image.
Background technology
Ground-object classification of hyperspectral images is a research hotspot in the current field of hyperspectral image processing; its research is mainly devoted to finding technical methods that let computers intelligently learn and recognize different image targets. Hyperspectral images have relatively high spectral resolution, generally reaching the order of 10⁻²λ; at the same time they have many bands, with the number of spectral channels reaching tens or even hundreds, and adjacent channels are often continuous. Ground-object classification of hyperspectral images has good application prospects in fields such as geological survey, crop disaster monitoring, atmospheric pollution, and military target strike. The most common hyperspectral ground-object classification workflow is: (1) input a hyperspectral image; (2) choose training samples and test samples from it; (3) learn features for the training samples and test samples with a feature learning method; (4) classify the learned features with a classifier; (5) obtain the classification result. One of the key problems is how to extract useful information from the large volume of redundant hyperspectral data and use a suitable feature learning method to characterize the representations of the different ground objects, because whether the representation is reasonable determines the upper limit of subsequent classification performance. In addition, because hyperspectral data suffer from unfavorable factors such as large data volume, high redundancy, and many bands, the technical methods used for feature learning on such data are required to be efficient, simple, and somewhat robust to noise interference.
In the paper "Linear Spatial Pyramid Matching Using Sparse Coding for Image Classification" (CVPR, 2009), Jianchao Yang et al. use a sparse coding based method to perform spatial pyramid max-pooling feature encoding on the original image data and finally classify with a classifier. The concrete steps of this method are: step 1, extract SIFT features from the samples; step 2, train a dictionary; step 3, encode the SIFT features with the dictionary to obtain sparse coding vectors, and apply a max-pooling algorithm to the sparse coding vectors to obtain the final feature of each sample; step 4, classify the final features with a linear support vector machine. Although this method encodes features relatively accurately, its weakness is that it depends heavily on the quality of the sparse coding.
Summary of the invention
In view of the above shortcomings of the prior art, the present invention proposes a new hierarchical discriminative feature learning method based on sparse coding. Class-label information is added when sparse-coding the hyperspectral image data, and structural information is added during hierarchical feature learning, so that the ground-object classification features are more discriminative, further improving the intelligent recognition of different ground objects in hyperspectral data images.
The technical scheme of the invention is a hyperspectral image classification method based on hierarchical sparse discriminative feature learning, comprising the following steps:
(1) Input hyperspectral remote sensing data containing C classes of ground objects. Each pixel is a sample, represented by its spectral feature vector; the feature dimension of a sample is h, and all samples form the sample set Y = {y_i ∈ R^h, i = 1, 2, …, N}, where y_i is the i-th sample, N is the total number of samples, and R denotes the real number field;
(2) Randomly select 10% of the samples of each class as the training set Y_train, where n₁ denotes the number of training samples; the remaining 90% of the samples form the test set Y_test, where n₂ denotes the number of test samples;
(3) Based on the training set Y_train and the sample set Y, apply the hierarchical discriminative feature learning method based on sparse coding to obtain the first-layer discriminative feature set and the second-layer discriminative feature set, whose i-th elements are the first-layer and second-layer discriminative features of the i-th sample of Y:
3a) Randomly select K₁ training samples from the training set as the initialization dictionary of the first-layer discriminative dictionary, and obtain the first-layer discriminative dictionary D using the discriminative K-SVD dictionary learning method;
3b) Based on the first-layer discriminative dictionary D, obtain the first-layer sparse coding features of all samples using the orthogonal matching pursuit algorithm;
3c) From the first-layer sparse coding features of all samples, obtain the first-layer discriminative feature set and the second-layer input feature set using the first-layer discriminative feature learning method;
3d) Randomly select K₂ features from the second-layer input features corresponding to the training set as the initialization dictionary D₂′ of the second-layer discriminative dictionary; combining the corresponding class-label matrix and discrimination matrix, optimize a discriminative-dictionary objective function analogous to the first-layer discriminative dictionary learning method to obtain the second-layer discriminative dictionary;
3e) Based on the second-layer input feature set of the sample set Y and the second-layer discriminative dictionary, obtain the second-layer sparse coding feature of each sample using the orthogonal matching pursuit algorithm; apply the max-pooling algorithm to the second-layer sparse coding features of all samples to obtain the second-layer discriminative feature set;
(4) Merge the first-layer discriminative feature set and the second-layer discriminative feature set to obtain the hierarchical discriminative feature set F of the sample set Y;
(5) Input the hierarchical discriminative features corresponding to the training set and the test set into a support vector machine to obtain the class-label vector of the test set; this label vector is the classification result of the hyperspectral image.
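The per-class 10%/90% split of step (2) can be sketched in Python as follows; the array sizes, the random data, and the use of NumPy are illustrative assumptions, since the text specifies only the percentages:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for a labeled hyperspectral sample set: N pixel samples,
# each an h-dimensional spectral vector (columns of Y), C classes.
h, N, C = 10, 200, 4
Y = rng.standard_normal((h, N))
labels = rng.integers(0, C, size=N)

train_idx, test_idx = [], []
for c in range(C):
    idx = rng.permutation(np.flatnonzero(labels == c))
    n_tr = max(1, round(0.10 * idx.size))   # 10% of each class for training
    train_idx.extend(idx[:n_tr].tolist())
    test_idx.extend(idx[n_tr:].tolist())    # remaining 90% for testing

Y_train, Y_test = Y[:, train_idx], Y[:, test_idx]
```

Splitting per class rather than globally keeps every class represented in the training set even when the class sizes are very unbalanced, as they are in real hyperspectral scenes.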
The concrete steps of the discriminative K-SVD dictionary learning method in step 3a) above are:
Step 1: based on the training set Y_train, the objective function of the discriminative K-SVD dictionary learning method is

min_{D,A,W,X} ‖Y_train − DX‖₂² + α‖Q − AX‖₂² + β‖H − WX‖₂²  s.t. ‖x_i‖₁ ≤ ε for all i,

where the first term is the reconstruction error term, the second term is the discriminative sparse coding constraint term, and the third term is the classification error term. D denotes the first-layer discriminative dictionary, containing K₁ dictionary atoms each of dimension d; W denotes the classification transformation matrix; A denotes the linear transformation matrix; X denotes the sparse coding coefficient matrix; ‖·‖₂² denotes the squared l₂ norm; α and β are regularization parameters balancing the class-discrimination term and the classification error, with value range 1~5; Q denotes the ideal discriminative sparse coding coefficient matrix: Q_ki = 1 if the k-th dictionary atom of D and the i-th sample of Y_train belong to the same class, and 0 otherwise; H denotes the class-label matrix of the training samples: H_ci = 1 if the i-th sample of Y_train belongs to class c (c = 1, 2, …, C), and 0 otherwise; x_i denotes the i-th column vector of X; ‖·‖₁ denotes the l₁ norm; ε is set to 10⁻⁶.
Step 2: in order to solve this objective function, it is rewritten as

min_{D_new, X} ‖Y_new − D_new X‖₂²  s.t. ‖x_i‖₁ ≤ ε for all i,

where Y_new = (Y_trainᵀ, √α Qᵀ, √β Hᵀ)ᵀ, D_new = (Dᵀ, √α Aᵀ, √β Wᵀ)ᵀ, and (·)ᵀ denotes the matrix transpose. This rewritten objective function is solved with the K-SVD dictionary learning method, yielding the first-layer discriminative dictionary D.
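The rewriting trick above folds the label terms into a single reconstruction problem by stacking the matrices. A small NumPy sketch of that stacking, with Q and H built exactly as defined in the text; all dimensions and the random data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

h, n1, K1, C = 20, 60, 12, 3               # illustrative sizes only
Y_train = rng.standard_normal((h, n1))      # training samples as columns
sample_labels = rng.integers(0, C, size=n1)
atom_labels = rng.integers(0, C, size=K1)   # class attached to each initial atom

# Q[k, i] = 1 iff atom k and sample i belong to the same class.
Q = (atom_labels[:, None] == sample_labels[None, :]).astype(float)

# H[c, i] = 1 iff sample i belongs to class c (one-hot label matrix).
H = np.zeros((C, n1))
H[sample_labels, np.arange(n1)] = 1.0

alpha, beta = 1.0, 1.0   # regularization weights (range 1~5 per the text)
# Stacking turns the three-term objective into one plain K-SVD problem
# on (Y_new, D_new): Y_new = (Y^T, sqrt(alpha) Q^T, sqrt(beta) H^T)^T.
Y_new = np.vstack([Y_train, np.sqrt(alpha) * Q, np.sqrt(beta) * H])
```

Because the squared Frobenius norm of a stacked matrix is the sum of the squared norms of its blocks, minimizing ‖Y_new − D_new X‖₂² recovers exactly the sum of the three original terms.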
The concrete steps of the orthogonal matching pursuit algorithm in step 3b) above are:
Step 1: based on the first-layer discriminative dictionary D, the objective of the orthogonal matching pursuit algorithm is, for each sample,

min_{z_i} ‖y_i − D z_i‖₂²  subject to a sparsity constraint on z_i,

where y_i denotes the i-th sample of the sample set Y, z_i denotes the sparse coding coefficient vector of y_i, and the residual tolerance δ is set to 10⁻⁶;
Step 2: construct the residual r⁽⁰⁾ = y_i, i = 1, 2, …, N; the index set Λ₀ is a K-dimensional zero vector; initialize the variable J = 1;
Step 3: find the index λ maximizing the inner product between the residual r⁽ᴶ⁻¹⁾ and the columns d_j of the dictionary D;
Step 4: update the index set Λ⁽ᴶ⁾ with Λ⁽ᴶ⁾(J) = λ; update the set of selected dictionary atom columns D⁽ᴶ⁾ = D(:, Λ⁽ᴶ⁾(1:J)); obtain the J-rank approximation z_i by the least squares method; compute the new residual r⁽ᴶ⁾ = y_i − D⁽ᴶ⁾ z_i; set J = J + 1;
Step 5: check whether the iteration ends: if J ≤ K, return to step 3 and continue iterating on the current sample; if J > K, move to the next unprocessed sample and return to step 2; when every sample y_i, i = 1, 2, …, N has been processed, the algorithm ends.
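A minimal NumPy sketch of the pursuit loop above (greedy atom selection plus a least-squares refit of all selected atoms); the function name and the exact stopping rule (at most `sparsity` atoms, or residual below `tol`) are assumptions for illustration:

```python
import numpy as np

def omp(D, y, sparsity, tol=1e-6):
    """Orthogonal matching pursuit: repeatedly pick the dictionary column
    most correlated with the residual, refit all selected atoms by least
    squares, and update the residual (steps 2-5 above)."""
    residual = y.astype(float).copy()
    support = []
    coef = np.zeros(0)
    for _ in range(sparsity):
        lam = int(np.argmax(np.abs(D.T @ residual)))  # best-matching atom
        if lam not in support:
            support.append(lam)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    z = np.zeros(D.shape[1])
    z[support] = coef
    return z

# Usage: with an orthonormal dictionary, OMP recovers a 2-sparse signal.
D = np.eye(5)
y = np.array([3.0, 0.0, 2.0, 0.0, 0.0])
z = omp(D, y, sparsity=2)
```

The least-squares refit over the whole support at every iteration is what distinguishes *orthogonal* matching pursuit from plain matching pursuit.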
The concrete steps of the first-layer discriminative feature learning method in step 3c) above are:
Step 1: centered on the sparse coding feature z_i of each sample, i = 1, 2, …, N, take a neighborhood window of size (2m+1) × (2m+1); the sparse coding features of all samples in the window form the sparse coding block Z_i, i = 1, 2, …, N, a three-dimensional matrix of size (2m+1) × (2m+1) × K₁;
Step 2: partition the sparse coding block Z_i of each sample using an (m+1) × (m+1) sliding window with step length m, traversing Z_i from top to bottom and left to right, extracting in turn the sparse coding representation sub-blocks Z_i⁽¹⁾, Z_i⁽²⁾, Z_i⁽³⁾ and Z_i⁽⁴⁾, 4 sub-blocks in total, each of size (m+1) × (m+1) × K₁;
Step 3: apply the spatial pyramid max-pooling algorithm to the 4 obtained sub-blocks in turn, where SM(·) denotes the spatial pyramid max-pooling operation, U denotes the number of spatial pyramid decomposition levels, V_u is the total number of blocks at level u of the spatial pyramid, and M(·) denotes the max-pooling algorithm;
Step 4: concatenate the 4 pooled results in row order to obtain the first-layer discriminative feature of the i-th sample, and concatenate them in column order to obtain the second-layer input feature of the i-th sample.
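Steps 1-4 above can be sketched as follows. For brevity the sketch uses a single pyramid level, i.e. plain max pooling M(·) per sub-block rather than the full multi-level SM(·); the function name and the toy sizes are assumptions:

```python
import numpy as np

def subblock_max_pool(Z, m):
    """Split a (2m+1, 2m+1, K1) sparse-coding block into the 4 overlapping
    (m+1, m+1, K1) sub-blocks (sliding window with step m, traversed
    top-to-bottom, left-to-right), max-pool each over its spatial extent,
    and concatenate the pooled K1-vectors."""
    feats = []
    for r in (0, m):            # window's top row: upper half, lower half
        for c in (0, m):        # window's left column: left half, right half
            sub = Z[r:r + m + 1, c:c + m + 1, :]
            feats.append(sub.max(axis=(0, 1)))  # per-channel spatial max
    return np.concatenate(feats)                # length 4 * K1

# Usage: m = 1, K1 = 2 -> a 3x3x2 block pooled into an 8-dim feature.
m, K1 = 1, 2
Z = np.zeros((2 * m + 1, 2 * m + 1, K1))
Z[0, 0, 0] = 5.0      # lands only in the top-left sub-block
Z[2, 2, 1] = 4.0      # lands only in the bottom-right sub-block
f = subblock_max_pool(Z, m)
```

Max pooling over overlapping spatial sub-blocks is what injects neighborhood (spatial-domain) context into each pixel's feature while staying invariant to small shifts inside each sub-block.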
Beneficial effects of the invention: the invention inputs hyperspectral image data, uses a randomly selected subset of training samples as the initial first-layer discriminative dictionary, and obtains the first-layer discriminative dictionary through discriminative dictionary learning. From this dictionary it solves the sparse coding representation coefficients of the neighborhood block of each hyperspectral sample and, via pyramid max pooling, obtains the initial second-layer dictionary and the first-layer coding features. The initial second-layer dictionary is then refined by the discriminative dictionary learning algorithm into the second-layer discriminative dictionary, from which the sparse coding representation coefficients of the corresponding second-layer region blocks are solved; pyramid max pooling then yields the second-layer coding features. The first-layer coding features are combined with the second-layer coding features as the finally learned feature, which is classified with a classifier, achieving hyperspectral ground-object classification with higher accuracy. Compared with the prior art, the invention has the following advantages:
First, by using discriminative dictionary learning, the invention takes class-label information into account in both the first-layer and second-layer dictionary learning, overcoming the shortcoming that traditional K-SVD dictionary learning does not make full use of class-label information, so that the learned dictionaries and the sparse coding coefficients obtained from them are more discriminative.
Second, by using multi-layer sparse coding feature learning, the invention overcomes the relatively low classification accuracy of the traditional approach of directly classifying single-layer sparse coding coefficients, giving the invention high classification accuracy.
Third, the feature learning method combining the spatial and spectral domains overcomes the shortcoming of single-pixel feature learning algorithms that ignore surrounding neighborhood information, making the features learned by the invention more robust.
The invention is described in further detail below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is the flow chart of the method of the invention;
Fig. 2 is the Indian Pines image used in the simulation experiment of the invention.
Embodiment
The invention is described further below in conjunction with the accompanying drawings.
The specific steps of the invention are described with reference to Fig. 1 as follows:
Step 1: input hyperspectral remote sensing data containing C classes of ground objects; each pixel is a sample represented by its spectral feature vector; the feature dimension of a sample is h, and all samples form the sample set Y = {y_i ∈ R^h, i = 1, 2, …, N}, where y_i is the i-th sample, N is the total number of samples, and R denotes the real number field;
Step 2: among these N samples, remove the background sample points, then randomly select 10% of the samples of each class as the training set Y_train, where n₁ denotes the number of training samples; the remaining 90% of the samples form the test set Y_test, where n₂ denotes the number of test samples;
Step 3: based on the training set and the sample set, apply the hierarchical discriminative feature learning method based on sparse coding to obtain the first-layer discriminative feature set and the second-layer discriminative feature set, whose i-th elements are the first-layer and second-layer discriminative features of the i-th sample of the sample set:
First step: randomly choose a part of each class's training samples, K₁ training samples in total, as the initialization dictionary of the first-layer discriminative dictionary, and obtain the first-layer discriminative dictionary D using the discriminative K-SVD dictionary learning method, whose objective function is

min_{D,A,W,X} ‖Y_train − DX‖₂² + α‖Q − AX‖₂² + β‖H − WX‖₂²  s.t. ‖x_i‖₁ ≤ ε for all i,

where the first term is the reconstruction error term, the second term is the discriminative sparse coding constraint term, and the third term is the classification error term; D denotes the first-layer discriminative dictionary, containing K₁ dictionary atoms each of dimension d; W denotes the classification transformation matrix; A denotes the linear transformation matrix; X denotes the sparse coding coefficient matrix; ‖·‖₂² denotes the squared l₂ norm; α and β are regularization parameters balancing the class-discrimination term and the classification error, with value range 1~5; Q denotes the ideal discriminative sparse coding coefficient matrix: Q_ki = 1 if the k-th dictionary atom of D and the i-th sample of Y_train belong to the same class, and 0 otherwise; H denotes the class-label matrix of the training samples: H_ci = 1 if the i-th sample of Y_train belongs to class c (c = 1, 2, …, C), and 0 otherwise; x_i denotes the i-th column vector of X; ‖·‖₁ denotes the l₁ norm; ε is set to 10⁻⁶.
In order to solve this objective function, it is rewritten as

min_{D_new, X} ‖Y_new − D_new X‖₂²  s.t. ‖x_i‖₁ ≤ ε for all i,

where Y_new = (Y_trainᵀ, √α Qᵀ, √β Hᵀ)ᵀ, D_new = (Dᵀ, √α Aᵀ, √β Wᵀ)ᵀ, and (·)ᵀ denotes the matrix transpose; solving this with the K-SVD dictionary learning method yields the first-layer discriminative dictionary D.
In the following, d_j denotes the j-th column atom of D_new, x̃_j denotes the j-th row of X, L denotes the total number of columns of D_new, d_k denotes the k-th column atom of D_new, x̃_k denotes the k-th row of X, and E_k denotes the error matrix produced by sparse decomposition without using the k-th column atom d_k of D_new;
The K-SVD dictionary learning method proceeds as follows:
1. Transform the discriminative dictionary objective function by expressing Y_new through the error matrix E_k, D_new through the atom d_k, and X through the coefficient row x̃_k, and multiply the resulting expression by Ω_k to obtain the target decomposition formula, where the restricted error matrix E_k Ω_k is the restriction of E_k to the samples that actually use atom d_k; Ω_k has size P × |ω_k|, where P denotes the number of columns of the training sample set Y_new and |ω_k| denotes the size of the index set ω_k of samples using atom d_k; Ω_k has a 1 at position (ω_k(j), j) and 0 everywhere else, where 1 ≤ j ≤ |ω_k| and ω_k(j) denotes the j-th element of ω_k;
2. Perform an SVD decomposition of the restricted error matrix, obtaining U Δ Vᵀ, where U denotes the left singular matrix, Vᵀ denotes the right singular matrix, and Δ denotes the singular value matrix;
3. Update the k-th column atom d_k of the dictionary D_new with the first column of the left singular matrix U;
4. Repeat steps 1 to 3 to update all atoms of D_new, obtaining the updated atoms d₁′, d₂′, …, d_K′ that form the new dictionary.
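The atom update of steps 1-3 can be sketched with NumPy's SVD. Standard K-SVD also refreshes the corresponding coefficient row from the same decomposition, which the sketch includes; the function name and the toy data are assumptions:

```python
import numpy as np

def ksvd_atom_update(D, X, Y, k):
    """One K-SVD atom update: restrict the error matrix E_k (computed
    without atom k) to the samples whose codes use atom k, take its
    rank-1 SVD approximation, and use it as the new atom d_k (first
    left singular vector) and updated coefficient row."""
    omega = np.flatnonzero(X[k, :])   # index set of samples using atom k
    if omega.size == 0:
        return D, X                    # atom unused: nothing to update
    E_k = Y - D @ X + np.outer(D[:, k], X[k, :])   # error without atom k
    U, s, Vt = np.linalg.svd(E_k[:, omega], full_matrices=False)
    D[:, k] = U[:, 0]                 # updated atom (unit norm)
    X[k, omega] = s[0] * Vt[0, :]     # updated coefficient row
    return D, X

# Usage: the update never increases the reconstruction error.
rng = np.random.default_rng(1)
Y = rng.standard_normal((8, 30))
D = rng.standard_normal((8, 5))
D /= np.linalg.norm(D, axis=0)
X = rng.standard_normal((5, 30)) * (rng.random((5, 30)) < 0.3)
err_before = np.linalg.norm(Y - D @ X)
D, X = ksvd_atom_update(D, X, Y, k=0)
err_after = np.linalg.norm(Y - D @ X)
```

Restricting to the columns in ω_k before the SVD is what preserves the sparsity pattern of X: samples that did not use atom k keep a zero coefficient after the update.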
Second step: based on the first-layer discriminative dictionary D, solve for each sample the objective

min_{z_i} ‖y_i − D z_i‖₂²  subject to a sparsity constraint on z_i,

using the orthogonal matching pursuit algorithm, obtaining the first-layer coding features of all samples, where y_i denotes the i-th sample of the sample set Y, z_i denotes the sparse coding coefficient vector of y_i, and the residual tolerance δ is set to 10⁻⁶. The orthogonal matching pursuit algorithm is as follows:
First construct the residual r⁽⁰⁾ = y_i; the index set Λ₀ is a K-dimensional zero vector; initialize the variable J = 1;
Then repeat the following steps 1-5:
1. Find the index λ maximizing the inner product between the residual r⁽ᴶ⁻¹⁾ and the columns d_j of the dictionary D;
2. Update the index set Λ⁽ᴶ⁾ with Λ⁽ᴶ⁾(J) = λ; update the set of selected dictionary atom columns D⁽ᴶ⁾ = D(:, Λ⁽ᴶ⁾(1:J));
3. Obtain the J-rank approximation z_i using the least squares method;
4. Update the residual r⁽ᴶ⁾ = y_i − D⁽ᴶ⁾ z_i and set J = J + 1;
5. Check whether the iteration ends: if J > K, terminate; otherwise return to 1.
Third step: from the first-layer sparse coding features of all samples, obtain the first-layer discriminative feature set and the second-layer input feature set using the first-layer discriminative feature learning method, which is as follows:
1. Centered on the sparse coding feature z_i of each sample, take a neighborhood window of size (2m+1) × (2m+1), m = 1, 2, …, and construct from the sparse coding features within the window the sparse coding representation block Z_i, i = 1, 2, …, N, a three-dimensional matrix of size (2m+1) × (2m+1) × K₁;
2. Partition the sparse coding representation block Z_i of each sample using an (m+1) × (m+1) sliding window with step length m, traversing the block from top to bottom and left to right and extracting in turn the sparse coding representation sub-blocks Z_i⁽¹⁾, Z_i⁽²⁾, Z_i⁽³⁾ and Z_i⁽⁴⁾, 4 sub-blocks in total, each of size (m+1) × (m+1) × K₁;
3. Apply the spatial pyramid max-pooling algorithm to the 4 obtained sub-blocks in turn, where SM(·) denotes the spatial pyramid max-pooling operation, U denotes the number of spatial pyramid decomposition levels, V_u is the total number of blocks at level u of the spatial pyramid, and M(·) denotes the max-pooling algorithm;
4. Concatenate the 4 pooled results in row order to obtain the first-layer discriminative feature of the i-th sample, and concatenate them in column order to obtain the second-layer input feature of the i-th sample.
Fourth step: randomly select a part of the second-layer input features obtained from the training set, K₂ in total, as the initialization dictionary D₂′ of the second-layer discriminative dictionary; combining the corresponding class-label matrix and discrimination matrix, obtain the second-layer discriminative dictionary by optimizing the discriminative-dictionary objective function analogously to the first-layer discriminative dictionary construction method;
Fifth step: based on the second-layer input features and the second-layer discriminative dictionary, obtain the second-layer sparse coding feature of each sample using the orthogonal matching pursuit algorithm, where the j-th column second-layer sparse coding feature is obtained from the second-layer input feature of the i-th sample; apply the max-pooling algorithm to the second-layer sparse coding features of all samples to obtain the second-layer discriminative feature set;
Step 4: combine the first-layer discriminative feature set and the second-layer discriminative feature set of all samples into the hierarchical discriminative feature set F;
Step 5: input the hierarchical discriminative features corresponding to the training set and the test set into a support vector machine to obtain the class-label vector of the test set; this label vector is the classification result of the hyperspectral image.
The effects of the invention are described further below in conjunction with Fig. 2.
The simulation of the invention is carried out on the Indian Pines hyperspectral image, acquired in June 1992 by NASA's representative AVIRIS sensor over northwestern Indiana. The Indian Pines image is 145 × 145 pixels and contains 220 bands; after removing the 20 water-absorption bands, the remaining 200 bands are used. The image contains 16 classes of ground objects, as shown in Table 1.
The simulation experiments of the invention are implemented in MATLAB 2011a on a platform with an AMD A4-3400 APU, 2.69 GHz main frequency, 4 GB memory, and 32-bit Windows 7.
Table 1: the 16 classes of data in the Indian Pines image
2. emulation content and analysis
The hyperspectral image is classified with the invention and three existing methods: the support vector machine (SVM), the sparse-representation-based classification method (SRC), and the sparse-representation-based spatial pyramid matching classification method (ScSPM). The penalty factor and kernel parameter of the SVM method are determined by 5-fold cross-validation; the regularization parameter λ of the SRC method is set to 0.1; the sparsity parameter of the SRC and ScSPM methods is set to 20; the spatial-domain scale parameter of the ScSPM method and of the invention is set to 7 × 7. For each of the 16 classes, 10% of the pixels are taken at random as training samples and the remaining 90% for testing; 5 experiments are run and the results averaged. The experimental accuracies of the three methods and of this method are shown in the table below:
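The SVM parameter selection described above (penalty factor and kernel parameter chosen by 5-fold cross-validation) can be sketched with scikit-learn; the text does not give the candidate grid, so the grid values below are illustrative assumptions, as is the synthetic data:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic two-class features standing in for the learned hierarchical
# discriminative features (120 samples, 8 dimensions).
X = rng.standard_normal((120, 8))
y = (X[:, 0] + 0.1 * rng.standard_normal(120) > 0).astype(int)

# 5-fold cross-validation over an illustrative grid of the RBF-SVM
# penalty factor C and kernel parameter gamma.
search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1.0, 10.0, 100.0], "gamma": [0.01, 0.1, 1.0]},
    cv=5,
)
search.fit(X, y)
best_params = search.best_params_
```

Cross-validating on the training split only, as described here, keeps the 90% test set untouched during parameter selection.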
Table 2: experimental accuracy results of the three existing methods and of the invention
As can be seen from Table 2, the method of the invention performs best in classification accuracy. The features learned by the invention's method, classified with the SVM classifier, achieve higher accuracy than classifying the raw data directly with SVM, showing that the learned features are better suited to the SVM classifier and, indirectly, that the learned features are effective. The features obtained by the invention's two layers of dictionary learning and sparse coding are also more effective than the features learned by ScSPM and better suited to the SVM classifier, showing that the invention has a clear advantage over the existing methods.
In summary, the hierarchical discriminative feature learning method based on sparse coding of the invention performs hyperspectral image classification by making full use of the sparse characteristics and spatial-domain contextual information of the hyperspectral image, and can classify the original hyperspectral image more accurately. The comparison with the three existing image classification methods illustrates the accuracy and validity of the invention. Compared with the prior art, it has the following advantages:
First, by using discriminative dictionary learning, the invention takes class-label information into account in both the first-layer and second-layer dictionary learning, overcoming the shortcoming that traditional K-SVD dictionary learning does not make full use of class-label information, so that the learned dictionaries and the sparse coding coefficients obtained from them are more discriminative.
Second, by using multi-layer sparse coding feature learning, the invention overcomes the relatively low classification accuracy of the traditional approach of directly classifying single-layer sparse coding coefficients, giving the invention high classification accuracy.
Third, the feature learning method combining the spatial and spectral domains overcomes the shortcoming of single-pixel feature learning algorithms that ignore surrounding neighborhood information, making the features learned by the invention more robust.
Parts not described in detail in this embodiment belong to common means well known in the industry and are not described one by one here. The above examples are merely illustrative of the invention and do not limit its scope of protection; every design identical or similar to the invention falls within the scope of protection of the invention.

Claims (4)

1. A hyperspectral image classification method based on hierarchical sparse discriminative feature learning, characterized by comprising the following steps:
(1) Input hyperspectral remote sensing data containing C classes of ground objects; each pixel is a sample represented by its spectral feature vector; the feature dimension of a sample is h, and all samples form the sample set Y = {y_i ∈ R^h, i = 1, 2, …, N}, where y_i is the i-th sample, N is the total number of samples, and R denotes the real number field;
(2) random 10% sample selected from every class sample set is as training setn1Represent training set sample Number, remaining 90% sample is used as test setn2Represent test set number of samples;
(3) based on the training set $Y_{train}$ and the sample set Y, apply the hierarchical discriminative feature-learning method based on sparse coding to obtain the first-layer discriminative feature set $F^{(1)}=\{f_i^{(1)}\}_{i=1}^{N}$ and the second-layer discriminative feature set $F^{(2)}=\{f_i^{(2)}\}_{i=1}^{N}$, where $f_i^{(1)}$ is the first-layer discriminative feature of the i-th sample of Y and $f_i^{(2)}$ is its second-layer discriminative feature:
3a) randomly select $K_1$ training samples from the training set as the initialization dictionary of the first-layer discriminative dictionary, and apply the discriminative K-SVD dictionary-learning method to obtain the first-layer discriminative dictionary D;
3b) based on the first-layer discriminative dictionary D, obtain the first-layer sparse-coding features of all samples with the orthogonal matching pursuit algorithm;
3c) from the first-layer sparse-coding features of all samples, apply the first-layer discriminative feature-learning method to obtain the first-layer discriminative feature set $F^{(1)}$ and the second-layer input feature set;
3d) randomly select $K_2$ features from the second-layer input features corresponding to the training set as the initialization dictionary $D_2'$ of the second-layer discriminative dictionary; combining the corresponding class matrix and discrimination matrix, optimize the discriminative-dictionary objective function analogously to the first-layer discriminative dictionary-learning method, obtaining the second-layer discriminative dictionary $D_2$;
3e) based on the second-layer input feature set of the sample set Y and the second-layer discriminative dictionary, obtain the second-layer sparse-coding features $z_{ij}^{(2)}$ of each sample with the orthogonal matching pursuit algorithm, where $z_{ij}^{(2)}$ is the j-th column of second-layer sparse-coding features obtained from the i-th second-layer input feature; then apply the max-pooling algorithm to the second-layer sparse-coding features of all samples to obtain the second-layer discriminative feature set $F^{(2)}$;
(4) fuse the first-layer discriminative feature set $F^{(1)}$ and the second-layer discriminative feature set $F^{(2)}$ to obtain the hierarchical discriminative feature set F of the sample set Y;
(5) input the hierarchical discriminative features corresponding to the training set and the test set into a support vector machine to obtain the class-label vector of the test set; this label vector is the classification result of the hyperspectral image.
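Outside the claim language, steps (2) and (4) above — the per-class 10%/90% split and the fusion of the two feature layers by concatenation — can be sketched in NumPy. All sizes, labels, and random feature matrices below are hypothetical stand-ins for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: N pixels, C ground-object classes.
N, C = 100, 4
labels = rng.integers(0, C, size=N)

# Step (2): randomly draw 10% of each class for training, rest for testing.
train_idx, test_idx = [], []
for c in range(C):
    idx = rng.permutation(np.flatnonzero(labels == c))
    k = max(1, int(round(0.1 * idx.size)))   # 10% of this class
    train_idx.extend(idx[:k])
    test_idx.extend(idx[k:])

# Step (4): fuse the two layers' discriminative features by concatenation.
F1 = rng.normal(size=(N, 20))   # stand-in first-layer features F^(1)
F2 = rng.normal(size=(N, 12))   # stand-in second-layer features F^(2)
F = np.hstack([F1, F2])         # hierarchical discriminative feature set
```

The fused matrix F (one row per sample) is what would be handed to the support vector machine in step (5).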
2. The hyperspectral image classification method based on hierarchical sparse discriminative feature learning according to claim 1, characterized in that the discriminative K-SVD dictionary learning of step 3a) comprises the following steps:
Step 1: based on the training set $Y_{train}$, the objective function of the discriminative K-SVD dictionary-learning method is

$$\arg\min_{D,W,A,X}\ \|Y_{train}-DX\|_2^2+\alpha\|Q-AX\|_2^2+\beta\|H-WX\|_2^2\quad \mathrm{s.t.}\ \forall i,\ \|x_i\|_1\le\varepsilon$$

where the first term is the reconstruction error, the second term is the discriminative sparse-coding constraint, and the third term is the classification error; D is the first-layer discriminative dictionary, containing $K_1$ dictionary atoms of dimension d each; W is the classification transformation matrix; A is the linear transformation matrix; X is the sparse-coding coefficient matrix; $\|\cdot\|_2^2$ is the squared $l_2$ norm; α and β are regularization parameters balancing the discriminative term and the classification error, with values in the range 1–5; Q is the ideal discriminative sparse-coding coefficient matrix: $Q_{ki}=1$ if the k-th dictionary atom of D and the i-th sample of $Y_{train}$ belong to the same class, and 0 otherwise; H is the class matrix of the training samples: $H_{ci}=1$ if the i-th sample of $Y_{train}$ belongs to class c (c = 1, 2, …, C), and 0 otherwise; $x_i$ is the i-th column of the sparse-coding coefficient matrix X; $\|\cdot\|_1$ is the $l_1$ norm; and ε is set to $10^{-6}$.
Step 2: to solve the objective function of the discriminative K-SVD dictionary-learning method, rewrite it as

$$\arg\min_{D_{new},X}\ \|Y_{new}-D_{new}X\|_2^2\quad \mathrm{s.t.}\ \forall i,\ \|x_i\|_1\le\varepsilon$$

where $Y_{new}=\big(Y_{train}^T,\ \sqrt{\alpha}\,Q^T,\ \sqrt{\beta}\,H^T\big)^T$, $D_{new}=\big(D^T,\ \sqrt{\alpha}\,A^T,\ \sqrt{\beta}\,W^T\big)^T$, and $(\cdot)^T$ denotes matrix transposition; solve this objective with the K-SVD dictionary-learning method, thereby obtaining the first-layer discriminative dictionary D.
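The rewriting in Step 2 works because stacking $Y_{train}$, Q and H (scaled by $\sqrt{\alpha}$ and $\sqrt{\beta}$) folds the three penalty terms into a single reconstruction term solvable by plain K-SVD. A small NumPy check of this algebraic equivalence, with hypothetical toy dimensions and random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes: h bands, n1 training samples, K1 atoms, C classes.
h, n1, K1, C = 5, 8, 6, 3
alpha, beta = 2.0, 3.0
Y_train = rng.normal(size=(h, n1))
Q = rng.random(size=(K1, n1))        # ideal discriminative sparse codes
H = rng.random(size=(C, n1))         # class matrix
D = rng.normal(size=(h, K1))         # dictionary
A = rng.normal(size=(K1, K1))        # linear transformation matrix
W = rng.normal(size=(C, K1))         # classification transformation matrix
X = rng.normal(size=(K1, n1))        # sparse-coding coefficients

# Stack with sqrt(alpha), sqrt(beta), as in the rewritten objective.
Y_new = np.vstack([Y_train, np.sqrt(alpha) * Q, np.sqrt(beta) * H])
D_new = np.vstack([D, np.sqrt(alpha) * A, np.sqrt(beta) * W])

# The single stacked residual equals the three-term objective.
single = np.linalg.norm(Y_new - D_new @ X) ** 2
three = (np.linalg.norm(Y_train - D @ X) ** 2
         + alpha * np.linalg.norm(Q - A @ X) ** 2
         + beta * np.linalg.norm(H - W @ X) ** 2)
```

Here `single` and `three` agree to floating-point precision, which is why minimizing the stacked form minimizes the original objective.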
3. The hyperspectral image classification method based on hierarchical sparse discriminative feature learning according to claim 1, characterized in that the orthogonal matching pursuit algorithm of step 3b) comprises the following steps:
Step 1: based on the first-layer discriminative dictionary D, the objective optimization function of the orthogonal matching pursuit algorithm is

$$\min_{z_i}\ \|y_i-Dz_i\|_2^2\quad \mathrm{s.t.}\ \|z_i\|_1\le\delta,\quad i=1,2,\dots,N$$

where $y_i$ is the i-th sample of the sample set Y, $z_i$ is the sparse-coding coefficient of $y_i$, and δ is set to $10^{-6}$.
Step 2: initialize the residual as $r^{(0)}=y_i$, i = 1, 2, …, N, initialize the index set $\Lambda_0$ as a K-dimensional null vector, and initialize the iteration variable J = 1;
Step 3: find the index λ corresponding to the maximum inner product between the residual $r^{(J-1)}$ and the j-th column $d_j$ of the dictionary D, i.e. $\lambda=\arg\max_j\ |\langle r^{(J-1)},d_j\rangle|$;
Step 4: update the index set $\Lambda^{(J)}$ with $\Lambda^{(J)}(J)=\lambda$; update the set of selected dictionary-atom columns $D^{(J)}=D(:,\Lambda^{(J)}(1\!:\!J))$; obtain the J-th order least-squares approximation $z_i$; update the residual $r^{(J)}=y_i-D^{(J)}z_i$ and set J = J + 1;
Step 5: check whether the iteration terminates: if J ≤ K and some $y_i$ has not yet been taken as residual, return to Step 2; if J ≤ K and every $y_i$, i = 1, 2, …, N, has been taken as residual, the procedure ends; if J > K, return to Step 3 and continue.
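For a single sample, the pursuit loop of Steps 2–5 can be sketched as follows; the function name, the toy orthonormal dictionary, and the test signal are hypothetical illustrations, not part of the claim:

```python
import numpy as np

def omp(D, y, max_atoms, tol=1e-6):
    """Orthogonal matching pursuit sketch: greedily select the atom with
    the largest |<residual, d_j>| (Step 3), refit all selected atoms by
    least squares and update the residual (Step 4), until max_atoms
    iterations or the residual norm falls below tol (Step 5)."""
    K = D.shape[1]
    z = np.zeros(K)
    residual = y.astype(float).copy()   # Step 2: r^(0) = y
    support = []
    coef = np.zeros(0)
    for _ in range(max_atoms):
        lam = int(np.argmax(np.abs(D.T @ residual)))   # Step 3
        if lam not in support:
            support.append(lam)
        # Step 4: least-squares fit on the selected atom columns
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
        if np.linalg.norm(residual) <= tol:            # Step 5
            break
    z[support] = coef
    return z

D = np.eye(4)                       # toy orthonormal dictionary
y = np.array([2.0, 0.0, 3.0, 0.0])  # exactly 2-sparse in D
z = omp(D, y, max_atoms=2)          # recovers [2, 0, 3, 0]
```

Because the toy signal is exactly 2-sparse in an orthonormal dictionary, two pursuit iterations recover its coefficients exactly.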
4. The hyperspectral image classification method based on hierarchical sparse discriminative feature learning according to claim 1, characterized in that the first-layer discriminative feature learning of step 3c) comprises the following steps:
Step 1: centered on the sparse-coding feature $z_i$ of each sample, i = 1, 2, …, N, take a neighborhood window of size (2m+1)×(2m+1); the sparse-coding features of all samples in the window form the sparse-coding block $Z_i$, i = 1, 2, …, N, a three-dimensional matrix of size $(2m+1)\times(2m+1)\times K_1$;
Step 2: partition each sparse-coding block $Z_i$ with a sliding window of size (m+1)×(m+1) and step m; traversing $Z_i$ from top to bottom and from left to right, extract in turn the four sparse-coding representative sub-blocks $Z_i^{(1)}$, $Z_i^{(2)}$, $Z_i^{(3)}$ and $Z_i^{(4)}$, each of size $(m+1)\times(m+1)\times K_1$;
Step 3: apply the spatial-pyramid max-pooling algorithm to each of the four sub-blocks in turn:

$$SM(Z_i^{(j)})=\big[M((Z_i^{(j)})_1^1),\dots,M((Z_i^{(j)})_1^{V_1}),\dots,M((Z_i^{(j)})_u^{V_u}),\dots,M((Z_i^{(j)})_U^{V_U})\big],\quad j=1,2,3,4$$

where $SM(\cdot)$ denotes the spatial-pyramid max-pooling operation, $(Z_i^{(j)})_u^v$ is the v-th block at the u-th level of the spatial pyramid, U is the number of spatial-pyramid decomposition levels, $V_u$ is the total number of blocks at the u-th level, and $M(\cdot)$ denotes the max-pooling algorithm;
Step 4: concatenate the four pooled sub-block features row-wise as $f_i^{(1)}=[SM(Z_i^{(1)}),SM(Z_i^{(2)}),SM(Z_i^{(3)}),SM(Z_i^{(4)})]$ to obtain the first-layer discriminative feature $f_i^{(1)}$ of the i-th sample, and stack them column-wise to obtain the second-layer input feature of the i-th sample.
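Steps 1–4 of claim 4 can be sketched in NumPy as follows. The pyramid levels (1, 2), the neighborhood parameter m = 2, the dictionary size K1 = 4, and the function name are hypothetical choices for illustration; the claim itself leaves U and $V_u$ general:

```python
import numpy as np

def spatial_pyramid_max_pool(Z, levels=(1, 2)):
    """Sketch of SM(.): at each pyramid level u, split the (s, s, K1)
    sub-block into u*u spatial cells, max-pool each cell per dictionary
    channel (M(.)), and concatenate all pooled vectors."""
    s = Z.shape[0]
    pooled = []
    for u in levels:
        edges = np.linspace(0, s, u + 1).astype(int)  # cell boundaries
        for a in range(u):
            for b in range(u):
                cell = Z[edges[a]:edges[a + 1], edges[b]:edges[b + 1], :]
                pooled.append(cell.max(axis=(0, 1)))  # M(.): channel-wise max
    return np.concatenate(pooled)

rng = np.random.default_rng(2)
m, K1 = 2, 4
Zi = rng.random(size=(2 * m + 1, 2 * m + 1, K1))   # Step 1: coding block
# Step 2: four (m+1)x(m+1) sub-blocks from the sliding window with step m
subs = [Zi[:m + 1, :m + 1], Zi[:m + 1, m:], Zi[m:, :m + 1], Zi[m:, m:]]
# Steps 3-4: pool each sub-block, then concatenate row-wise into f_i^(1)
f1 = np.concatenate([spatial_pyramid_max_pool(Z) for Z in subs])
```

With levels (1, 2) each sub-block yields 1 + 4 = 5 pooled cells of K1 = 4 channels, so the four sub-blocks give a first-layer feature of length 4 × 5 × 4 = 80.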
CN201410647211.4A 2014-11-14 2014-11-14 A kind of hyperspectral image classification method based on the sparse differentiation feature learning of layering Active CN104408478B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410647211.4A CN104408478B (en) 2014-11-14 2014-11-14 A kind of hyperspectral image classification method based on the sparse differentiation feature learning of layering


Publications (2)

Publication Number Publication Date
CN104408478A CN104408478A (en) 2015-03-11
CN104408478B true CN104408478B (en) 2017-07-25

Family

ID=52646109

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410647211.4A Active CN104408478B (en) 2014-11-14 2014-11-14 A kind of hyperspectral image classification method based on the sparse differentiation feature learning of layering

Country Status (1)

Country Link
CN (1) CN104408478B (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016154440A1 (en) * 2015-03-24 2016-09-29 Hrl Laboratories, Llc Sparse inference modules for deep learning
CN104750875B (en) * 2015-04-23 2018-03-02 苏州大学 A kind of machine error data classification method and system
CN105160295B (en) * 2015-07-14 2019-05-17 东北大学 A kind of rapidly and efficiently face retrieval method towards extensive face database
CN105740884B (en) * 2016-01-22 2019-06-07 厦门理工学院 Hyperspectral Image Classification method based on singular value decomposition and neighborhood space information
CN106096571B (en) * 2016-06-22 2018-11-16 北京化工大学 A kind of cell sorting method based on EMD feature extraction and rarefaction representation
CN106203523B (en) * 2016-07-17 2019-03-01 西安电子科技大学 The hyperspectral image classification method of the semi-supervised algorithm fusion of decision tree is promoted based on gradient
CN106203532B (en) * 2016-07-25 2019-10-08 北京邮电大学 Across the size measurement method and apparatus of moving target based on dictionary learning and coding
CN106570509B (en) * 2016-11-04 2019-09-27 天津大学 A kind of dictionary learning and coding method for extracting digital picture feature
CN106778808B (en) * 2016-11-09 2020-09-08 天津大学 Image feature learning method based on group sparse coding
CN106557782B (en) * 2016-11-22 2021-01-29 青岛理工大学 Hyperspectral image classification method and device based on class dictionary
CN106780387B (en) * 2016-12-22 2020-06-02 武汉理工大学 SAR image denoising method
CN107203750B (en) * 2017-05-24 2020-06-26 中国科学院西安光学精密机械研究所 Hyperspectral target detection method based on combination of sparse expression and discriminant analysis
CN107358249A (en) * 2017-06-07 2017-11-17 南京师范大学 The hyperspectral image classification method of dictionary learning is differentiated based on tag compliance and Fisher
CN108133232B (en) * 2017-12-15 2021-09-17 南京航空航天大学 Radar high-resolution range profile target identification method based on statistical dictionary learning
CN107909120A (en) * 2017-12-28 2018-04-13 南京理工大学 Based on alternative label K SVD and multiple dimensioned sparse hyperspectral image classification method
CN109033980B (en) * 2018-06-29 2022-03-29 华南理工大学 Hyperspectral image Gabor feature classification method based on incremental local residual least square method
CN109063766B (en) * 2018-07-31 2021-11-30 湘潭大学 Image classification method based on discriminant prediction sparse decomposition model
CN109583380B (en) * 2018-11-30 2020-01-17 广东工业大学 Hyperspectral classification method based on attention-constrained non-negative matrix factorization
CN110009032B (en) * 2019-03-29 2022-04-26 江西理工大学 Hyperspectral imaging-based assembly classification method
CN110110789A (en) * 2019-05-08 2019-08-09 杭州麦迪特检测技术服务有限公司 A kind of Chinese herbal medicine quality discrimination method based on multispectral figure information fusion technology
CN110287818B (en) * 2019-06-05 2024-01-16 广州市森锐科技股份有限公司 Hierarchical vectorization-based face feature vector optimization method
CN111709442A (en) * 2020-05-07 2020-09-25 北京工业大学 Multilayer dictionary learning method for image classification task
CN112115972B (en) * 2020-08-14 2022-11-22 河南大学 Depth separable convolution hyperspectral image classification method based on residual connection
CN112241768A (en) * 2020-11-25 2021-01-19 广东技术师范大学 Fine image classification method based on deep decomposition dictionary learning
CN112614053B (en) * 2020-12-25 2021-08-24 哈尔滨市科佳通用机电股份有限公司 Method and system for generating multiple images based on single image of antagonistic neural network

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102891999A (en) * 2012-09-26 2013-01-23 南昌大学 Combined image compression/encryption method based on compressed sensing
US8374442B2 (en) * 2008-11-19 2013-02-12 Nec Laboratories America, Inc. Linear spatial pyramid matching using sparse coding
CN103065160A (en) * 2013-01-23 2013-04-24 西安电子科技大学 Hyperspectral image classification method based on local cooperative expression and neighbourhood information constraint
US8467610B2 (en) * 2010-10-20 2013-06-18 Eastman Kodak Company Video summarization using sparse basis function combination



Similar Documents

Publication Publication Date Title
CN104408478B (en) A kind of hyperspectral image classification method based on the sparse differentiation feature learning of layering
CN106529508B (en) Based on local and non local multiple features semanteme hyperspectral image classification method
CN106815601B (en) Hyperspectral image classification method based on recurrent neural network
CN106023065B (en) A kind of tensor type high spectrum image spectral-spatial dimension reduction method based on depth convolutional neural networks
CN104751191B (en) A kind of Hyperspectral Image Classification method of sparse adaptive semi-supervised multiple manifold study
CN102208034B (en) Semi-supervised dimension reduction-based hyper-spectral image classification method
CN104281855B (en) Hyperspectral image classification method based on multi-task low rank
CN103886342B (en) Hyperspectral image classification method based on spectrums and neighbourhood information dictionary learning
CN109766858A (en) Three-dimensional convolution neural network hyperspectral image classification method combined with bilateral filtering
CN104392251B (en) Hyperspectral image classification method based on semi-supervised dictionary learning
CN104298999B (en) EO-1 hyperion feature learning method based on recurrence autocoding
CN110309867B (en) Mixed gas identification method based on convolutional neural network
CN103065160B (en) Based on the hyperspectral image classification method that the collaborative expression in local and neighborhood information retrain
CN108090447A (en) Hyperspectral image classification method and device under double branch's deep structures
CN104200217B (en) Hyperspectrum classification method based on composite kernel function
CN103336968A (en) Hyperspectral data dimensionality reduction method based on tensor distance patch alignment
CN108960330A (en) Remote sensing images semanteme generation method based on fast area convolutional neural networks
CN103208011B (en) Based on average drifting and the hyperspectral image space-spectral domain classification method organizing sparse coding
CN105956611A (en) SAR image target identification method based on authentication non-linear dictionary learning
CN104866871B (en) Hyperspectral image classification method based on projection structure sparse coding
CN107392128A (en) The robust image recognition methods returned based on double low-rank representations and local constraint matrix
CN104182767B (en) The hyperspectral image classification method that Active Learning and neighborhood information are combined
CN104778482A (en) Hyperspectral image classifying method based on tensor semi-supervised scale cutting dimension reduction
CN108830130A (en) A kind of polarization EO-1 hyperion low-altitude reconnaissance image typical target detection method
CN103425995A (en) Hyperspectral image classification method based on area similarity low rank expression dimension reduction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant