CN104008383B - Hyperspectral image feature extraction method based on manifold learning linearization - Google Patents

Hyperspectral image feature extraction method based on manifold learning linearization

Info

Publication number
CN104008383B
Authority
CN
China
Prior art keywords
matrix
manifold learning
dimensionality reduction
hyperspectral image
hyperspectral
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410286545.3A
Other languages
Chinese (zh)
Other versions
CN104008383A (en)
Inventor
张淼
赖镇洲
刘攀
沈毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology
Priority to CN201410286545.3A
Publication of CN104008383A
Application granted
Publication of CN104008383B
Legal status: Active


Landscapes

  • Image Analysis (AREA)

Abstract

A hyperspectral image feature extraction method based on manifold learning linearization, belonging to the technical field of hyperspectral image data processing and application. Addressing the deficiency that manifold learning methods lack generalization ability, the present invention proposes an improved manifold learning linearization method. The method comprises the following steps: first, computing a preliminary dimensionality reduction result and the Laplacian matrix; second, constructing the constant-term matrix and coefficient matrix of a matrix equation; third, computing the feature conversion matrix; fourth, computing the final dimensionality reduction result via the feature conversion matrix. Addressing the deficiency that the global linear mapping assumed by the linearized manifold learning methods LPP, NPE and LLTSA often does not hold, the present invention adds to the original cost function a penalty term for deviating from the original manifold learning result and discards the constraint term in the original objective function, so that solving for the optimal feature conversion matrix is converted into solving a matrix equation. The method is suitable for feature extraction from hyperspectral images.

Description

Hyperspectral image feature extraction method based on manifold learning linearization
Technical field
The invention belongs to the technical field of hyperspectral image data processing and application, and relates to a hyperspectral image feature extraction method, in particular to a hyperspectral image feature extraction method based on manifold learning linearization.
Background art
A hyperspectral image is a data cube with an enormous amount of information: each pixel corresponds to a spectral curve containing up to hundreds of bands, which makes it possible to study the relationship between materials and their spectral curves. However, hyperspectral data suffers from data redundancy and the curse of dimensionality, so there is an urgent need to eliminate this redundant information. The redundancy is mainly caused by the correlation between the bands of the hyperspectral data, and dimensionality reduction is an important preprocessing method. Although linear dimensionality reduction methods such as PCA (Principal Component Analysis) and LDA (Linear Discriminant Analysis) are simple to implement, hyperspectral images have nonlinear characteristics, and manifold learning methods can better uncover the nonlinear structure of hyperspectral data and improve data analysis capability. Classical manifold learning methods include LE (Laplacian Eigenmap), LLE (Locally Linear Embedding) and LTSA (Local Tangent Space Alignment), all of which can serve as feature extraction methods for hyperspectral images.
However, classical manifold learning methods such as LE have no generalization ability: when new hyperspectral sample data arrives, the dimensionality reduction result for the new samples can only be obtained by combining the new and original samples and relearning the whole set. When the number of new samples is small compared with the number of original samples, repeating such large-scale computation clearly increases the time complexity of the method enormously. For hyperspectral classification, a manifold learning method must possess generalization ability, because in many cases the training data and the test data to be classified cannot be learned together; in that situation the test data must be mapped into the low-dimensional feature space by a generalizable method, otherwise the new hyperspectral data cannot be classified in the low-dimensional feature space.
Although linear dimensionality reduction methods such as PCA handle nonlinear data poorly, they do obtain a global mapping function and therefore possess generalization ability. For this reason, many scholars have linearized manifold learning methods to overcome their lack of generalization ability. Typical examples are the linearizations of LE, LLE and LTSA, which yield LPP (Locality Preserving Projections), NPE (Neighborhood Preserving Embedding) and LLTSA (Linear Local Tangent Space Alignment), respectively.
Manifold learning methods share a unified framework. For spectral-decomposition-based manifold learning methods such as LE, LLE and LTSA, the optimal dimensionality reduction result Y* can be obtained by solving the following optimization problem:
Y* = arg min_Y tr(Y L Y^T), s.t. Y B Y^T = I (1),
where tr(·) is the matrix trace operator, L is the Laplacian matrix, and B is a constraint matrix. LPP, NPE and LLTSA all assume that a global linear mapping exists in the dimensionality reduction process:
Y = V^T X (2),
so that the optimization problem in formula (1) becomes:
V* = arg min_V tr(V^T X L X^T V), s.t. V^T X B X^T V = I (3).
After the optimal linear mapping matrix V* is obtained, any sample, whether new or old, can be mapped into the reduced-dimensional space by formula (2). But the assumption of formula (2) often does not hold; in order to obtain a global linear mapping, these methods in fact compromise the local-preserving property of the original methods to a certain extent.
Summary of the invention
Addressing the deficiency that manifold learning methods lack generalization ability, the present invention proposes an improved manifold learning linearization method and applies it to feature extraction from hyperspectral images.
The purpose of the present invention is achieved through the following technical solution:
So that the linearization process sacrifices as little as possible of the original manifold learning method's ability to preserve local characteristics, the present invention adds to the original cost function a penalty term for deviating from the original manifold learning result, so that the optimization problem in formula (3) becomes:
min_V g(V) = tr(V^T X L X^T V) + α ||V^T X − Y_L||_F^2 (4),
where Y_L is the dimensionality reduction result already learned, which can be obtained by LE, LLE, LTSA or any other well-performing manifold learning method that lacks generalization ability, and α is the penalty coefficient. Compared with formula (3), formula (4) no longer needs the constraint term V^T X B X^T V = I; this widens the search region of the optimization problem, so a better dimensionality reduction effect than traditional dimensionality reduction methods can be obtained. This generalization approach solves for a linearization mapping matrix V given the known Y_L, and is a kind of global linear regression. Whereas the original manifold learning methods all obtain the optimal dimensionality reduction result by eigendecomposition, the improved method proposed by the present invention obtains it by solving a matrix equation.
The solution procedure for the optimization problem in (4) is given here.
Taking the derivative of the cost function g(V) with respect to V:
∂g(V)/∂V = 2X L X^T V + 2αX X^T V − 2αX Y_L^T = 2X A X^T V − 2αX Y_L^T (5),
where A = αI + L. Setting the derivative to zero:
2X A X^T V − 2αX Y_L^T = 0 (6).
Then the optimization problem in formula (4) becomes the problem of solving a matrix equation:
(X A X^T) V = αX Y_L^T (7).
This is equivalent to the following linear-system problem:
(X A X^T ⊗ I_d) vec(V) = α vec(X Y_L^T) (8),
so that:
V = avec((X A X^T ⊗ I_d)^{-1} · α vec(X Y_L^T)) (9).
Formula (9) is a functional expression, where vec(·) is the matrix-to-vector operator: for a D × d matrix, vec(·) converts each row of the matrix into a d-dimensional column vector and stacks the converted row vectors by row number into a (D·d)-dimensional vector. avec(·) is the vector-to-matrix operator, its inverse operation, which converts a (D·d)-dimensional column vector back into a D × d matrix.
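The row-wise vec(·)/avec(·) pair described above can be sketched in a few lines. This is an illustrative aside, not part of the patent; the function names come from the text, and realizing them with NumPy's C-order reshape is an assumption:

```python
import numpy as np

def vec(M):
    # Row-wise flatten: stacks the rows of a D x d matrix into a D*d vector.
    return M.reshape(-1)

def avec(v, D, d):
    # Inverse operation: rebuild the D x d matrix from its row-wise vector.
    return v.reshape(D, d)

M = np.arange(12).reshape(3, 4)  # D = 3, d = 4
assert np.array_equal(avec(vec(M), 3, 4), M)  # round trip recovers M
```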
Further simplification of (9) gives:
V = (X A X^T)^{-1} B (10).
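As a numerical sanity check (an illustrative sketch with synthetic data, not part of the patent), one can verify that the V of formula (10) satisfies the matrix equation (7), i.e. makes the gradient of the cost in (4) vanish:

```python
import numpy as np

rng = np.random.default_rng(0)
D, d, N, alpha = 8, 3, 50, 1.0

X = rng.standard_normal((D, N))        # data matrix (D x N)
Y_L = rng.standard_normal((d, N))      # preliminary embedding (d x N)
M = rng.standard_normal((N, N))
L = M @ M.T                            # symmetric PSD stand-in for the Laplacian

A = alpha * np.eye(N) + L              # A = alpha*I + L
C = X @ A @ X.T                        # coefficient matrix (D x D)
B = alpha * X @ Y_L.T                  # constant-term matrix (D x d)
V = np.linalg.solve(C, B)              # V = C^{-1} B, formula (10)

# Gradient of g(V) = tr(V^T X L X^T V) + alpha * ||V^T X - Y_L||_F^2
grad = 2 * C @ V - 2 * alpha * X @ Y_L.T
assert np.allclose(grad, 0.0, atol=1e-6)  # stationarity condition (6) holds
```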
The hyperspectral image feature extraction method based on manifold learning linearization provided by the present invention comprises the following steps:
Step 1: compute the preliminary dimensionality reduction result and the Laplacian matrix.
Given a hyperspectral data set X, where X is a D × N matrix, D is the data dimension, N is the number of samples, and d is the reduced dimension, execute one of the manifold learning methods such as LE, LLE or LTSA to obtain its preliminary dimensionality reduction result Y_L and Laplacian matrix L, where Y_L is a d × N matrix and L is an N × N matrix.
Step 2: construct the constant-term matrix and coefficient matrix of the matrix equation.
Linearized manifold learning methods such as LPP, NPE and LLTSA generally obtain the optimal feature conversion matrix by eigendecomposing the N × N Laplacian matrix L, which is computationally expensive. The innovation of this step is that solving for the optimal feature conversion matrix is converted into solving a matrix equation.
The matrix equation is shown in formula (7); the construction formulas of the constant-term matrix and the coefficient matrix are given here:
1) Construct the D × d constant-term matrix B of the matrix equation:
B = αX Y_L^T,
where α is a positive penalty coefficient, which may be set to 1 by default.
2) Construct the D × D coefficient matrix C of the matrix equation:
C = X(αI + L)X^T,
where I is the N × N identity matrix.
Step 3: compute the feature conversion matrix.
In the usual solution procedure for a matrix equation, matrices are vectorized and vectors are converted back into matrices, and the existence and uniqueness of the solution must also be considered; these operations make solving the matrix equation time-consuming and memory-intensive. The innovation of this step is that the matrix-equation problem is simplified to a matrix inversion followed by a matrix multiplication, which is easy to implement and has low computational complexity.
1) Invert the coefficient matrix C to obtain the matrix H:
H = C^{-1}.
2) Multiply H by the constant-term matrix B to obtain the feature conversion matrix V:
V = HB.
Step 4: compute the final dimensionality reduction result via the feature conversion matrix:
Y = V^T X,
where Y is the final dimensionality reduction result.
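Given a precomputed preliminary embedding Y_L and Laplacian L, Steps 2 through 4 reduce to a few matrix operations. A minimal sketch follows (the function name and NumPy usage are illustrative assumptions, not from the patent):

```python
import numpy as np

def linearized_embedding(X, Y_L, L, alpha=1.0):
    """Steps 2-4 (sketch): X is D x N data, Y_L the d x N preliminary
    embedding, L the N x N Laplacian; returns (V, Y) with Y = V^T X."""
    N = X.shape[1]
    B = alpha * X @ Y_L.T                  # step 2.1: constant-term matrix (D x d)
    C = X @ (alpha * np.eye(N) + L) @ X.T  # step 2.2: coefficient matrix (D x D)
    V = np.linalg.inv(C) @ B               # step 3: H = C^{-1}, then V = H B
    return V, V.T @ X                      # step 4: final reduction result

# Synthetic example (placeholder data, not IND PINE)
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 40))
Y_L = rng.standard_normal((4, 40))
M = rng.standard_normal((40, 40))
V, Y = linearized_embedding(X, Y_L, M @ M.T)
```

In practice `np.linalg.solve(C, B)` would be preferred over an explicit inverse, but the explicit H = C^{-1} mirrors the two sub-steps of Step 3.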
The beneficial effects of the present invention are:
1. The method is a global linear regression method, so that after any manifold learning method without generalization ability is linearized by this method, it acquires generalization ability and can achieve a better feature extraction effect than traditional dimensionality reduction methods.
2. Addressing the deficiency that the global linear mapping assumed by the linearized manifold learning methods LPP, NPE and LLTSA often does not hold, the method adds to the original cost function a penalty term for deviating from the original manifold learning result and discards the constraint term in the original objective function; this expands the search region of the optimization problem, so a better dimensionality reduction effect than traditional dimensionality reduction methods can be obtained.
Brief description of the drawings
Fig. 1 is the flow chart of the steps of the present invention;
Fig. 2 is the feature scatter plot of the hyperspectral image data in the present invention;
Fig. 3 shows the classification accuracy of the hyperspectral image after feature extraction based on the present invention.
Specific embodiment
The technical solution of the present invention is further described below with reference to the accompanying drawings, but is not limited thereto; any modification or equivalent substitution of the technical solution of the present invention that does not depart from its spirit and scope shall be covered by the protection scope of the present invention.
In the first step, the present invention uses an existing manifold learning method to obtain the Laplacian matrix and the preliminary dimensionality reduction result; here the LLE method is taken as the example manifold learning method of the first step, and the method proposed by the present invention is then used to extract features from a hyperspectral image. The experiment selects the IND PINE hyperspectral image, captured by the Kennedy Space Center of the U.S. over a farmland in Indiana, USA. The image contains 16 different crop classes; its spatial resolution is 20 × 20 m², each pixel has 224 bands covering the spectral range of 0.2-2.4 μm, and the spectral resolution is 10 nm.
From the IND PINE hyperspectral image, 1500 pixels are randomly selected as training samples, and 1500 pixels are randomly selected from the remaining samples as test samples. In this embodiment, the present invention is used to learn from the training samples and obtain the feature conversion matrix; features are then extracted from the test samples using the feature conversion matrix, and the test samples are classified with a KNN classifier. To verify the effectiveness of the feature extraction of the present invention, the classical linearization of LLE, the NPE method, is used as a comparison on the same training data, test data and classifier.
As shown in Fig. 1, the specific steps of feature extraction using the present invention are as follows:
Step 1: obtain the dimensionality reduction result Y_L and Laplacian matrix L by the LLE manifold learning method.
1) Input the training set X = {x_i}, where each sample has dimension 224 and the number of samples is 1500; set the number of nearest neighbors to 20; set the dimension of the dimensionality reduction result to 30.
2) Find the neighborhood set X_i.
For each sample x_i in the training set, where i denotes the location index of x_i in the training sample set X, i = 1, 2, ..., 1500, sort the samples by pairwise Euclidean distance and find the 20 samples in the training set nearest to x_i; these constitute the neighborhood set X_i of x_i.
3) Construct the reconstruction coefficient matrix W:
W_ij = T(x_j, X_i)/s if x_j ∈ X_i, and W_ij = 0 otherwise,
where W_ij is the reconstruction coefficient of x_j for x_i, and x_j is the sample with location index j in the training sample set X. T(x_j, X_i) is calculated by the following formula:
T(x_j, X_i) = Σ_m P_{m, index(x_j, X_i)},
where index(x_j, X_i) denotes the location index of x_j within X_i, so that T computes the sum of all elements of column index(x_j, X_i) of the matrix P. The matrix P is the inverse of the local covariance matrix of the neighborhood set X_i, calculated by:
P = [(X_i − x_i)^T (X_i − x_i)]^{-1},
and s is the sum of all elements of the matrix P:
s = Σ_m Σ_n P_{mn}.
4) Construct the Laplacian matrix:
L = (I − W)^T (I − W),
where I is the N × N identity matrix.
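A quick check that L = (I − W)^T(I − W), built from reconstruction coefficients whose rows sum to 1, has the expected properties. This is a synthetic illustration (random W, small N), not part of the patent:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 6
W = rng.random((N, N))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)  # LLE constraint: each row of W sums to 1

I = np.eye(N)
L = (I - W).T @ (I - W)

assert np.allclose(L, L.T)                      # symmetric by construction
assert np.all(np.linalg.eigvalsh(L) >= -1e-10)  # positive semidefinite (Gram matrix)
assert np.allclose(L @ np.ones(N), 0.0)         # constant vector lies in the null space
```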
5) Perform the eigendecomposition:
Lz = λz.
6) Obtain the preliminary dimensionality reduction result:
Y_L = [z_2, z_3, ..., z_{d+1}]^T,
where z_{ii} is the eigenvector corresponding to the ii-th smallest eigenvalue of L, ii = 2, ..., d+1.
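Sub-steps 5) and 6) can be sketched as follows, with a synthetic W in place of the LLE coefficients (an illustration, not the patent's experiment; variable names are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
N, d = 20, 3
W = rng.random((N, N))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)            # rows sum to 1, as in LLE
L = (np.eye(N) - W).T @ (np.eye(N) - W)      # step 4): Laplacian

vals, vecs = np.linalg.eigh(L)               # step 5): eigenvalues ascending
Y_L = vecs[:, 1:d + 1].T                     # step 6): eigenvectors 2..d+1, d x N

assert Y_L.shape == (d, N)
assert abs(vals[0]) < 1e-8                   # discarded smallest eigenvalue is ~0
```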
Step 2: construct the matrix-equation constant-term matrix B and coefficient matrix C.
1) Set the positive penalty coefficient α; here α = 1.
2) Calculate the constant-term matrix B by the following formula:
B = αX Y_L^T.
3) Calculate the coefficient matrix C by the following formula:
C = X(αI + L)X^T.
Step 3: compute the feature conversion matrix V.
1) Invert the coefficient matrix C to obtain the matrix H:
H = C^{-1}.
2) Multiply H by the constant-term matrix B to obtain the feature conversion matrix V:
V = HB.
Step 4: compute the final dimensionality reduction result via the feature conversion matrix.
1) Calculate the final dimensionality reduction result Y of the training set by the following formula:
Y = V^T X.
2) Calculate the final dimensionality reduction result Ŷ of the test set by the following formula:
Ŷ = V^T X̂,
where X̂ is the test sample set.
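Generalization to new samples is a single matrix product per batch. A minimal sketch with placeholder matrices (not the IND PINE data; shapes and names are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
D, d, N_test = 10, 4, 7
V = rng.standard_normal((D, d))          # learned feature conversion matrix
X_test = rng.standard_normal((D, N_test))  # new (test) samples, one per column

Y_test = V.T @ X_test                    # embed all test samples at once
assert Y_test.shape == (d, N_test)       # d-dimensional features per sample
```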
The final dimensionality reduction results of the test set and the training set are shown in Fig. 2. It can be seen from the figure that the generalization results for the new samples are basically consistent with the dimensionality reduction results of the training samples, which shows that the generalization effect of the proposed feature extraction method is good.
Step 5: classify the test samples using the KNN classification method.
1) Calculate the Euclidean distance between the final dimensionality reduction result of each test sample and the final dimensionality reduction results Y of all the training samples.
2) Take the 5 nearest training samples as the neighbors of the test sample.
3) Classify the test sample according to the majority category among these 5 neighbors.
4) Calculate the classification accuracy.
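The KNN classification of Step 5 can be sketched as follows (an illustrative implementation with toy data, not the IND PINE experiment; the patent uses k = 5):

```python
import numpy as np

def knn_predict(Y_train, labels, Y_test, k=5):
    """Step 5 sketch: label each column of Y_test (d x M) by majority
    vote among its k nearest training columns (Euclidean distance)."""
    preds = []
    for y in Y_test.T:
        dist = np.linalg.norm(Y_train.T - y, axis=1)  # 1) distances to training set
        nn = labels[np.argsort(dist)[:k]]             # 2) k nearest neighbors
        vals, counts = np.unique(nn, return_counts=True)
        preds.append(vals[np.argmax(counts)])         # 3) majority category
    return np.array(preds)

# Toy example: two well-separated clusters
Y_train = np.array([[0.0, 0.1, 0.2, 10.0, 10.1, 10.2],
                    [0.0, 0.0, 0.0, 10.0, 10.0, 10.0]])
labels = np.array([0, 0, 0, 1, 1, 1])
assert knn_predict(Y_train, labels, np.array([[0.05], [0.0]]), k=5)[0] == 0
```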
The classification accuracy is shown in Fig. 3. It can be seen from Fig. 3 that after the hyperspectral image is generalized by the proposed feature extraction method, KNN classification achieves a higher overall classification accuracy, and the classification accuracy of the LLE-GLR method is clearly higher than that of the LLE and NPE methods, which shows that the feature extraction method proposed by the present invention helps improve the classification accuracy of hyperspectral images.
From the above analysis, the hyperspectral image feature extraction method based on manifold learning linearization proposed by the present invention indeed improves on existing manifold learning linearization methods. The feature extraction method proposed in the present invention can effectively overcome the deficiency that existing manifold learning methods cannot learn from new samples, and can at the same time improve the classification accuracy of hyperspectral images; it has great practical engineering value.

Claims (1)

1. A hyperspectral image feature extraction method based on manifold learning linearization, characterized in that the steps of said hyperspectral image feature extraction method are as follows:
First, given a hyperspectral data set X, obtain the preliminary dimensionality reduction result Y_L and the Laplacian matrix L by a manifold learning method, where X is a D × N matrix, D is the data dimension, N is the number of samples, Y_L is a d × N matrix, L is an N × N matrix, and d is the reduced dimension;
Second, construct the matrix-equation constant-term matrix B and coefficient matrix C:
1) construct the D × d constant-term matrix B of the matrix equation:
B = αX Y_L^T,
where α is a positive penalty coefficient;
2) construct the D × D coefficient matrix C of the matrix equation:
C = X(αI + L)X^T,
where I is the N × N identity matrix;
Third, compute the feature conversion matrix:
1) invert the matrix-equation coefficient matrix C to obtain the matrix H:
H = C^{-1};
2) multiply H by the matrix-equation constant-term matrix B to obtain the feature conversion matrix V:
V = HB;
Fourth, compute the final dimensionality reduction result via the feature conversion matrix:
Y = V^T X,
where Y is the final dimensionality reduction result.
CN201410286545.3A 2014-06-24 2014-06-24 Hyperspectral image feature extraction method based on manifold learning linearization Active CN104008383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410286545.3A CN104008383B (en) 2014-06-24 2014-06-24 Hyperspectral image feature extraction method based on manifold learning linearization


Publications (2)

Publication Number Publication Date
CN104008383A CN104008383A (en) 2014-08-27
CN104008383B (en) 2017-03-08

Family

ID=51369032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410286545.3A Active CN104008383B (en) 2014-06-24 2014-06-24 Hyperspectral image feature extraction method based on manifold learning linearization

Country Status (1)

Country Link
CN (1) CN104008383B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107203779A (en) * 2017-05-11 2017-09-26 中国科学院西安光学精密机械研究所 Hyperspectral dimensionality reduction method based on spatial-spectral information preservation
CN112016366A (en) * 2019-05-31 2020-12-01 北京车和家信息技术有限公司 Obstacle positioning method and device
CN110222631B (en) * 2019-06-04 2022-03-15 电子科技大学 SAR image target recognition method based on data-block-parallel tangent space alignment
CN110781974A (en) * 2019-10-31 2020-02-11 上海融军科技有限公司 Dimension reduction method and system for hyperspectral image

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103136736A (en) * 2013-03-19 2013-06-05 哈尔滨工业大学 Nonlinear dimensionality reduction method for hyperspectral remote sensing data

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7907777B2 (en) * 2005-01-25 2011-03-15 Siemens Medical Solutions Usa, Inc. Manifold learning for discriminating pixels in multi-channel images, with application to image/volume/video segmentation and clustering


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhi He et al., "Hyperspectral image classification with multivariate empirical mode decomposition-based features," 2014 IEEE International Instrumentation and Measurement Technology Conference, 2014-05-15, pp. 999-1004. *
Pu Hanye et al., "A new dimensionality reduction algorithm for hyperspectral images based on manifold learning" (基于流形学习的新高光谱图像降维算法), Infrared and Laser Engineering (红外与激光工程), January 2014, vol. 43, no. 1, pp. 232-237. *

Also Published As

Publication number Publication date
CN104008383A (en) 2014-08-27


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant